Cooling the CoolBlue build. It should be very easy, right?

How did we get here?

This started easily enough: add an All-In-One cooler to CoolBlue. Yeah, not so much. Oh, everything leading up to the change went well, but it wasn’t until we got our hands dirty that things took a turn. Let’s start from the beginning.

I had a Corsair H100i CPU cooler not being used, but it wouldn’t fit as it was, so I went shopping. I had to find a retrofit kit, which in this case was four slightly shorter standoff posts. That was easy enough; they are widely available and not unreasonably priced. Not unreasonable until you factor in shipping.

The Corsair site had these for 7.99 USD. Great, let’s order them. 9.99 USD for shipping. What? They cost more to ship than the item itself? You’ve lost your pea-pickin’ mind! There were several sellers on eBay, however, so that’s where we ordered. Four days later, things looked ready to go.

I chose to reuse the DeepCool fans already in the case, which was fine, but as it turns out, the case has a flaw. The fans that mount to the top of the case can only mount in the center. Many case manufacturers add additional grooves so things can be pulled closer to the glass panel and not have an issue with the height of cooling blocks on the motherboard. This case lacks that. My next option was to mount the radiator behind the front panel.

A Different Problem

Technically it fits, but it looks awkward. The front has three fans mounted and now the top two reach into the case further, making things look sloppy. There is also the matter of the CPU cooling block having a lot of extra cables causing clutter. Still, I could look past that if it kept things nice and cool. It didn’t.

Okay, it did keep things cool, but only as much as the tower cooler I took out. Not a good trade. It looked bad from the front, more cables to manage in the back, and now the temps were only slightly better. By slightly better, I mean margin of error better. I was able to get slightly better temps by removing the front panel, but that’s not ideal. So, you’re thinking, a horrible fail.

Yes and no. It was a failure, but I learned something: what not to do.

What’s next for CoolBlue

The very next thing to do is remove the All-in-One cooler from the build. Aesthetically, it’s not pleasing and it’s also not effective.

Next, I will look at the big brother to the current AG400 CPU cooler, the AG620. This is rated for a higher total power output, and although it’s more than that CPU needs, it pushes more air directly out of the case. It may only mean a few degrees cooler, but to be honest, temps weren’t bad, I was just looking to improve them so I wouldn’t have any throttling when I edit.

This also may mean a different case. CoolBlue gets its name because it is an Intel CPU/GPU build with DeepCool parts (including the case), but I did put a Cooler Master power supply in. I may be able to do the same with the case and still keep the theme. I may even be able to find a blue case. If I find one with better airflow, that could solve all of the problems at once, and that would be cool. No pun intended. After that, it will be the process of swapping SSDs and loading the programs I use most often, then it may be ready to become my new everyday driver. We’ll see.

Link to the YouTube video

Back to the Blog Page

The Amazing Lenovo M700. Slow and Dirty Edition

How did we get here?

That part is actually easy enough. My friend Tom has a Lenovo PC he ‘inherited’ from work, and it needed some attention. It came complete with an Intel i5-6500 four-core beast of a processor, 8GB of memory, a 640GB Western Digital Blue spinning hard drive, and a fair amount of dust. The dust wasn’t the problem. Okay, it was a problem, but a very small one.

The big problem was the hard drive. I’ve been working with computers a long time and have never seen a 640GB drive in the wild. To top it off, the drive had a lot of data, so a 500GB SSD was out of the question. For anyone who hasn’t been SSD shopping lately, prices have fallen drastically, so now may be a good time. I found a 1TB TeamGroup drive for around 50 bucks. Then, it was time to clone. We’ll get back to that in a minute.

Dusting didn’t take long. I got out the electric duster and blew out the big stuff, then turned my attention to the fans, and repasted the CPU cooler. It didn’t have heat problems, but I erred on the side of caution. A microfiber cloth and some alcohol wipes, and it was cleaned up and ready for the next steps. I needed a video card, and to replace the antenna on the wifi card. Taking care of the drives would be easy, and it would be ready to give back. I was wrong. Very, very wrong. The drives were not easy, but we’ll get back to that.

Starting off with the easy steps

The final step mechanically was to add in a graphics card. Nothing fancy; Tom said he just needed an HDMI port, and NVidia’s GT 1030 has that. To any who may be reading this and not familiar, the GT 1030 was the low end of the 10-series cards from NVidia. After its release with 2GB of GDDR5 memory, NVidia cut corners on later models by using DDR4 instead. There is a very real difference between the two; GDDR5 is roughly twice as fast, for one. Still, this fit the need. This isn’t a gaming PC, it just needed a tune-up and an HDMI port, which the 1030 definitely has.

With the CPU in good shape and the graphics card installed, it was time to tackle the hard drive issue. Total time for this thing to boot up was about three minutes. It should have taken less, but the BIOS was set up to boot with a network connection. The problem? It wasn’t on a network to boot from. The HDD did have an operating system, though.

Every boot first went through the network process, then checked the HDD, which sounded bad. It almost sounded like scraping on something. I’m sure most of you have heard the sound of a drive clicking as it searches. This one sounded tired. It was time for the new SSD. Luckily, I now had one. I just had to clone the old one.

The cloning process

Did you know there is no company that offers a truly open source cloning program? They all either make you pay up front, sign you up for an elaborate scheme to get your data, offer only a small trial period, or provide a very limited set of utilities until you subscribe to a monthly program. Hmmm, yeah, that’s the reason I don’t use Adobe products, either. Then I remembered the only good thing about that 640GB drive: it was a WD. That means there is a version of Acronis specifically for Western Digital drives. Nice. I downloaded the software and I was in business. I chose the type of cloning and was ready to start.

Then came the real problem. The drive sounded like it was on its last legs, and I wasn’t 100% sure it would even survive the cloning process. That wasn’t the only issue, though. It was going to take five hours to clone. Five. There was no way this drive was going to survive five hours of activity. I started the process and left it overnight, expecting an error message when I woke up. Instead, I was greeted with success. Nice! I unplugged the HDD and tried it. After the network attempt, it booted into the SSD. We were doing well, so far.
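Out of curiosity, here is a quick back-of-the-envelope check of just how slow that overnight clone really was. This assumes the full 640GB was copied; the real figure could be lower if the software only cloned used sectors.

```python
# Rough effective throughput of the overnight clone.
# Assumes the full 640 GB was copied; the real figure may be lower
# if the cloning tool only copied used sectors.
drive_gb = 640
hours = 5
throughput_mb_s = drive_gb * 1000 / (hours * 3600)
print(f"about {throughput_mb_s:.1f} MB/s")  # about 35.6 MB/s
```

A healthy SATA hard drive usually manages 100 MB/s or more on sequential reads, so that number alone was a hint the drive was struggling.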

The next step was to get into the BIOS and redirect the boot order to the SSD. I plugged the HDD back in and it promptly failed. It booted first to the network, then from the HDD. What? Seriously? But I already changed the boot order! The flipping thing had four different boot up scenarios in the BIOS. Completely ridiculous. This was definitely a former office PC.

The finished Lenovo

I set all of the boot orders the same. Check for a USB device first, then move to the SSD, then to the HDD. I saved, exited, and waited for it to punk me again. It booted straight from the SSD in about twenty seconds. Finally, we were in good shape. There were some final touches cleaning, a new antenna for the network card, and we were truly in business. This thing had new life.

Then curiosity took over. I chose to test Shadow of the Tomb Raider and CyberPunk, and wasn’t surprised by its inability to reach even twenty frames per second, but that was never the goal of this computer. It will be used primarily as a creative tool and doesn’t need to be the end-all/be-all PC. There’s no need to equal a two-thousand-dollar custom PC, or even a budget gaming PC. It just has to be solid, start quickly, and output using HDMI.

After running it through a few tests and making sure the graphics driver was good, I gave it my blessing. With any luck, this PC will last several more years. Or, at least long enough for Tom to get all of the use he needs from it. Now, it’s on to the next project.

The video can be found here.

Back to the blog page

CoolBlue Gets a Huge Upgrade – So Much Better!

How did we get here?

It started simply enough with only wanting to build a PC with two brands. CoolBlue was a combination of DeepCool parts and Intel, but it was incomplete. I wasn’t using an available Intel NVMe drive and the fans were not all addressable RGB. Sometimes, I don’t leave well enough alone. The link to that blog is here.

While looking for said parts, I found a price drop on a 12700KF from Intel. It was a good deal, but I was only interested if I found an equally good deal on the motherboard. Damn. I found one. That meant if it turned out better than my editing rig, I had work to do. Swapping platforms is a huge deal.

There was only one way to find out if the combination would be better, so I got to work. The parts came in, and this time the build went much quicker than the last effort. With everything installed, including the new NVMe and fans, I started testing. So far, the tests have only included the Arc A750, Intel’s graphics card, but the CPU-intensive tests told the story.

The testing

On a free benchmark render tool called Cinebench R23, I saw what amounted to a 50% uplift in performance over my current setup. This test renders a complex image using only the central processor, with no help from the graphics card. One and a half times the performance out of the 12700KF was far more than I could have planned. It was time to run the game benchmarks.
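For anyone curious how I figure the uplift, it is just a ratio of the two scores. As a sketch: the 22000 is the R23 score this post mentions for the 12700KF, while the old rig's score below is a stand-in value for illustration, not a measured result.

```python
# Percentage uplift between two Cinebench R23 multi-core scores.
# 22000 is the 12700KF score mentioned in this post; 14600 is a
# stand-in for the old rig, not a measured number.
def uplift_percent(old_score: float, new_score: float) -> float:
    return (new_score - old_score) / old_score * 100

print(f"{uplift_percent(14600, 22000):.0f}% uplift")  # 51% uplift
```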

Not having run benchmarks on the Ryzen 7 5800X combination with the Arc A750, I had to rely on uplift from the Intel 11400 tested previously. The change was not only noticeable, but I was getting results from the A750 that had previously been failed tests. It just ran better with the new CPU. The A750 wasn’t good enough to be my daily driver, but it was obvious the new CPU was.

I used a DeepCool AG400 tower cooler, which will handle most activity well. And, with four PC120 fans, temperatures were very stable during most testing. It throttled during CPU stress tests. Yes, the same R23 test that measured a 22000 score did so with a red CPU light on. It was impressive.

This is a great processor, and better than what I’m using. I now had a conundrum.

Is CoolBlue my new everyday rig?

My setup with I Am Number Four is very solid and stable. It’s been my workhorse for the channel and my growing social media effort. Changing would require me to swap a lot of drives and programs over. I was hesitant. I’m still hesitant.

I will need to figure out which drives to move and what to do with I Am Number Four, but CoolBlue will indeed become my editing, streaming, and gaming rig. You don’t realize how comfortable you are with something until you go to change it. I still don’t use the gaming setup in the other room, so I will be breaking that PC down as well. I didn’t see that coming, but I guess I’m crossing a threshold of sorts.

This is a very capable PC and will be a great next step in handling everything I need to throw at it. And, for the first time in a very, very long time, my ‘Go To’ rig will be an Intel. I will, of course, change out the graphics card and the power supply, but it will be a Cooler Master, so the name will stay the same. Paired with an RTX 3060 Ti, there won’t be much it can’t handle. Welcome to the family, CoolBlue.

The YouTube video can be found here.

Back to the Blog page

Intel’s new GPU and CPU, match made in heaven?

How did we get here?

It started when I thought it would be a good idea to buy an Intel Arc GPU to test. I picked up the A750 for a reasonable price (which is about to drop) and went to work. Testing had its issues and there were some bumps in the road, but I got results that were helpful. I then turned my attention to building and testing a rig with both an Intel CPU and GPU in it, and that wasn’t so easy.

There were issues recognizing the GPU and an NVMe drive, but the primary problem was acting like something altogether different. After realizing a BIOS update was necessary, I was back on track, but nowhere near as excited. Links to the BLOG and VIDEO, in case you are interested. It was some work, but it did work.

I wasn’t able to complete benchmarks because of issues with time, but I needed to post the video. So, I went back to work on the same rig after posting, and ran into more issues. I was already aware that some games were not going to run properly, so I left them off the list. I was also aware that one game just takes forever to benchmark, so I left it off as well. Bad choice. Benchmarking on that is going on as I write this.


One of my biggest issues was that the game with the Vulkan graphics API wouldn’t select that protocol, forcing me to stick with DX11, a known shortcoming. In fact, I had problems with another DX11 title, Borderlands 3. It’s a game I have thought about dropping, but now I may actually have to.

I was testing on DX11 and DX12, and it just quit. Period. It won’t run. It shows the splash screen, then shuts down completely and won’t start up. I have a similar problem trying to run this game on my editing rig, but Borderlands 3 now defaults to DX12 and I have several games with that setting. It’s great to benchmark, but it may be time to retire it.

Other issues included World War Z, where the framerate caps and I get horrible artifacting, and Horizon Zero Dawn, which shows the same framerates no matter the actual resolution. The latter was also confusing during my tests with the Ryzen CPU, so I was expecting it, but it shows Intel still has some work to do on their drivers.

What’s next for the Arc GPU?

I’m not done with this one yet, though I decided that CoolBlue (Intel test rig) needs some changes. Instead of just changing a fan and getting an Intel NVMe to bring this closer to a true DeepCool/Intel build, I ordered a new platform. For the first time in over ten years, I am probably going to make my main rig an Intel.

I will be changing the GPU, mind you, but I will test this A750 with a 12700KF in a brand new B760 motherboard. The choice wasn’t made lightly. I was going to go with AM5, but recent problems with motherboards overvolting those CPUs had me turn to something already proven for my new build. But that’s another story…

The fun in assembling a great Intel Test Rig

How did we get here?

The first part of building a solid Intel test rig is, of course, the platform. The i5 11400F was my choice for the CPU. It has six cores and twelve threads and is a great midrange processor. I had recently purchased an Arc A750 graphics card, so all I needed was the support material and cooling. I have a DeepCool case and recently picked up a set of DeepCool RGB fans, so a plan was forming. The power supply and storage were also ready, so I gathered everything I needed, and was off to a great start.

Assembly wasn’t difficult. The case is great to work in and the fans fit perfectly. There was already an NVMe drive installed, and except for an errant motherboard screw, things fit well together. The second step was going well. That’s where the fun stopped, though. The motherboard chosen had a BIOS from before the Arc A750’s release, and it wasn’t until a fair bit of troubleshooting that I discovered this.

The NVMe drive also didn’t show up as a boot drive, although it had a new copy of Windows 10 Pro. Troubleshooting one problem at a time is fair, but this masqueraded as a DRAM problem and CPU issue, instead of the GPU. One complete disassembly and reassembly later, I thought of the BIOS. Achievement unlocked.

I still had no drive showing up, so I tried to repair Windows. I don’t know why there is a repair utility in the Media Creation Tool. It never works, EVER. Time to reinstall instead. Nope, the NVMe already had a master boot record. Seriously? The BIOS says it doesn’t (even after the update) and the Windows utility says it does. This isn’t going great anymore.

Day Two

With a cooler head, I came back, grabbed an SSD with no data, and started over. To my amusement, everything worked. I then started with my normal installs only to realize that if I were going to do this right, I would need an Intel NVMe. Stop installing stuff, Paul. And, while you’re at it, order another DeepCool fan to balance out the RGB. Done. Time to edit what I had, and post it on YouTube.

Everything wasn’t done yet, but there was enough material for the build video. The video was up, so I went to dinner. I then realized that I forgot the thumbnail. Oh, I made one, I just forgot to use it. The frustration finally got me. It would be hours until I could fix it, but it couldn’t be helped. It would wait.

Finishing the Intel Test Rig

All I had to do was wait for the new drive and the extra fan and I could start benchmarks. What? A notice about my order? Delayed? What the……?

To be continued. In the meantime, check out the Arc A750 GPU post.

The HP Franken Dell. Scary project or an outstanding idea?

Not everything is a bad idea.

So, recently I tried a few fairly standard upgrades on an old Dell Optiplex. They were simple, but effective, and all was good. So, how did we get from there, to the HP Franken Dell? There were a few steps.

Over a year ago, I ordered the cheapest DX12 capable prebuilt PC on Amazon. It was a refurbished Dell Optiplex with an i5 4670 four core, four thread processor and no dedicated graphics card. It did come with dual channel memory, but only 8GB. Upgrades were relatively simple. More capacity on the RAM, an SSD instead of the hard disk drive and a graphics card. It wasn’t the best Gaming PC in the world, but for about 330 USD, accounting for price drops, it was solid. One of the videos benchmarking current games can be found here.

Then came the idea of upgrading the CPU to a Xeon. The choice of the Xeon came from the natural upgrade path (i7 4770) being more expensive. The E3 1270v3 has almost the same clock speed and the same number of cores and threads. It was also only 28 bucks and some change.

The surprise came when it turned out not to be an upgrade because of the GPU. A better GPU meant needing more room, and while I was at it, I might as well pick up the motherboard for another 18. Nice.

Not an Optiplex anymore

Freed from the Optiplex case, I could test the Xeon with other video cards, and I found that the four-core, eight-thread chip actually performed very well for a ten-year-old processor. The problem was, it was stuck in a lousy case. The temps were a bit high on the GPU side, but not horrible. I just wasn’t sure what I wanted to do with it.

At the same time, I was considering turning the old HP a6512p into a bit of a sleeper build. The fifteen-year-old PC wasn’t very capable of playing modern games, even with a decent GPU, but it did have a redeeming quality. A standard mATX motherboard fit inside and it took a regular-size power supply. The same size motherboard that the Xeon called home. Hmmm. Nah, but maybe, just maybe, it would work.

I knew there were going to be a few issues, the front panel accessories being one of them. The front panel lacked the USB3 available from the newer motherboard, and there was a FireWire (1394) connection that was useless. An expansion bay including USB3 was cheap, so it made the list. Also on the list was something to help with the adapters I had already purchased that worked for the pink case.

Finally, the HP Franken Dell.

When it was all said and done, it worked. Temps weren’t great, but I know there isn’t a lot of airflow in that case, so it’s something to work on. Maybe a Noctua fan will move more air and run quieter. As it stands, I have some room, but not a lot, for more drives, and I have to consider whether the FrankenDell will be a better solution for the server I keep putting off. Noise and temps first, then I will explore more drives.

Overall, it was a fun experiment. Parts fit where maybe they shouldn’t have, and I can see why companies like HP and Dell now make proprietary parts. (Though some of their parts could use a good swap.) I don’t yet know what will become of the older HP motherboard and Q6600 that were originally in the 6512p, but I also have an AM3+ motherboard around somewhere without a home. Hmmm, I wonder if I should get an FX processor and pit it against the Intel CPUs like the Xeon. To be continued… maybe.

Link to the YouTube video about the HP FrankenDell.

And of course, back to the blog section

Intel Arc Graphics. Outstanding, Good, or Trash?

How did we get here?

Intel dominates part of the central processing arena, but what about graphics? For several years there have been two major players in the GPU space, NVidia and AMD. They’ve had decades to refine the craft and both have strong products. NVidia has dominated until recent years, but what about a third? Can someone else come in and compete with two established juggernauts?

Enter the Arc line of GPUs. It’s Intel’s first foray into the market with a dedicated PCIe card, and their first dedicated card since 1998’s AGP offering. They have had integrated graphics built into motherboard chipsets and processors for decades, but steered clear of discrete cards. That is, until lately, with Xe graphics.

Okay, but Intel?

Having a small YouTube channel gives me an opportunity to test many different PC parts. Most are mainstream, some a little odd, but which category does the Arc GPU fit into? I didn’t have an interest in that question until I found out Intel looks to continue making GPUs. At the time of writing, there are many unconfirmed rumors that Intel will indeed continue for at least two more generations.

Intel has three discrete cards, the A380, A750, and the A770, with all three being reasonably priced. I chose the 750 because it has fair competition in the same price range. NVidia has the RTX 3050 and RTX 3060, while AMD has the RX 6600 and 6600XT; both also have older products. With an RTX 3060 and RX 6600XT both on hand, I had test material. If only it were that easy.

First impressions of the card were favorable. It’s sleek, appears to be well built, and is beautiful to look at. Other things about it aren’t as compelling. The drivers are cumbersome, though I understand they’re much improved, and the need for Resizable BAR (a PCIe feature that lets the CPU address all of the GPU’s memory at once) required a BIOS update. It also required digging through outside source material to configure the BIOS before I could reinstall the graphics drivers. Setup wasn’t painless.


It took several hours, but I was finally able to start testing. With the proper drivers installed, I spent some time repeating the same exercises that gave me issues on the first attempt, and I was pleased to find things finally working. No locked screens, no blackouts, and no error message saying Resizable BAR wasn’t enabled. It was finally time to test games.

The Arc performed well in DirectX 12 titles, but I was not prepared for the nosedive with the DirectX 11 API. The card stumbled out of the gate on just the second game, Borderlands 3. It typically runs better on the older API than the new one, but is always within margin of error. Results didn’t get above 77 frames per second in DX11, where the other cards approached 200. I knew it struggled, but not that much.

So, is the Intel Arc GPU junk?

Not at all, but it does still need some work. The performance in DirectX 12 matches the other cards, and switching Borderlands 3 to DX12 saw it beating the 3060. I need another example of a Vulkan title, but what I’ve seen so far is promising. Temperatures stay in the low to mid 70s, and the clock speed maintains 2400 MHz. Menus and apps are responsive with no delays or hang-ups. Overall performance is good, with no artifacting or glitches. It also looks nice, with smooth lines and a tasteful white LED.

AMD and NVidia options straddle the Intels in both price and performance, and although Intel has no high-end offering, it wasn’t long ago that AMD only targeted the midrange market. The A750 retails at $250, placing it firmly in the mid tier of GPUs. It’s a solid card and I look forward to testing others.

The video can be found here

Back to the blog page

Buy a new laptop or dust off the old one?

How did we get here?

This all started when I was getting a bit frustrated with my PC setup in the office and needed a break. I pulled out my Dell Inspiron laptop from 2018 and began working on my blogs and, eventually, slides. I carry it with me occasionally, but almost never use it because of an annoying popping sound from the sound driver, and because the bottom cover won’t stay on without some exterior help: tape. Tape is messy.

In any case, I had just gone through testing the new Xeon to compare with the older Optiplex and realized I had another four-core, four-thread older CPU. You guessed it, the laptop. The Dell Inspiron in my possession is a model 15 7567 from 2018 and comes with a GTX 1050 dedicated graphics card. When it came out, it was a decent gaming laptop. Now, it ranks faster than only about 5% of gaming laptops. It’s only five years old. I say only five, but computer years are sort of like dog years.

Still, the Optiplex was only a bit older and it’s a good gamer. I’ve been able to show on the channel that with a little TLC, it performs well even in newer titles. The laptop, coming in with a processor three generations newer, should be fine. The Inspiron comes with an i5-7300HQ processor, and although it doesn’t meet the specs for Windows 11, neither does the Optiplex. (Who comes up with these names, by the way? Inspiron for personal use and Optiplex for business use, I get it, but… different blog.) The laptop also has a 1080P IPS screen, so I was able to test at full 1080P resolution.

Testing the laptop

My first thought was to compare it to the Optiplex with the newer RX6400, but that didn’t seem fair. I opted instead to test against the RX560 from that period. The i5 4670 probably did see some pairings with the RX560 4GB video card, and obviously the 7300HQ saw some pairings with the GTX 1050. It’s a matchup that likely existed in the wild at one point.

I then realized that the data I had from the RX560 was about a year old because I used that GPU in another build. So, even though the laptop needed driver updates (lack of use), I chose not to update them. I wanted to keep the playing field even. Both had 16GB of memory, but the Optiplex came equipped with DDR3 instead of the laptop’s DDR4. Speeds were 1600 and 2400 respectively, which should be close enough. After all, the Xeon handled almost every game I tested at more than 60 fps with an older RX480. It did even better with an RX 6600XT.

Both of those were written about in earlier blogs, but as we get deeper into the tech side of these, I will provide more detail. In this case, my results ranged from 17-20 frames per second in more difficult games like CyberPunk, to over sixty frames per second in Forza Horizon 4, all at 1080P, without adding any resolution enhancement. The 1050 is too old for DLSS, but will work with AMD’s FSR. It was not able to use raytracing. Just as well.

If it’s there, why not try it?

I did try a few examples of FSR, although I wasn’t able to compare against the RX560. The 21 FPS in CyberPunk on 1080P High became 27, and the 28 fps on Low worked out to 37 fps. Setting this to 1080P Low with Performance FSR, frame-locked at 30fps, might just be the way to play a game like CPK. It’s ironic that AMD’s tech works on an NVidia card. Okay, funny, too. I have my own opinion on whether the big green machine cares about the consumer. RTX 4070, anyone?
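Putting those FSR numbers side by side, the percentage gain works out to roughly the same at both presets. A quick sketch using the frame rates above:

```python
# FSR gains from the numbers above: CyberPunk at 1080P,
# High preset 21 -> 27 fps, Low preset 28 -> 37 fps.
def gain_percent(before_fps: float, after_fps: float) -> float:
    return (after_fps - before_fps) / before_fps * 100

for preset, before, after in [("High", 21, 27), ("Low", 28, 37)]:
    print(f"{preset}: {gain_percent(before, after):.0f}% more frames")
# High: 29% more frames
# Low: 32% more frames
```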

AMD’s resolution tools do help this combo, some, but the problem again is VRAM. More and more games are using more GPU memory, and it kills older cards. 8GB isn’t enough anymore, much less 4GB. Lower settings help, but even with all of the drivers updated, it’s still five years old. Newer budget gaming laptops eat this thing for an afternoon snack; it doesn’t compare.

What’s next for the laptop?

First things first. I ordered a new bottom cover that will be here this week. No more tape. Second, I will clean up the 1TB drive currently in it and probably replace it with an SSD to hold some games and make it quicker. Then I’ll use it, or I won’t. I have Danny DD, my small form factor PC, but I carry my laptop to many of the same places out of habit. I need to be okay with one or the other. Drivers were updated as well, and that got rid of the annoying popping sound, though it still doesn’t sound great.

Whichever happens, I’ve discovered that I actually still like it. More than likely, I’ll use it a while to see what happens. I don’t know that I’d like a new laptop better, so I’m leery about dropping extra money, but I might end up taking it places more often. We’ll see.

RTX4070 for $600? Is it now the worst GPU ever?

How did we get here?

The RTX 4070 is NVidia’s latest graphics card, and in a place where they could have reset the market and blown the doors off of both of their competitors, it seems they have pulled up short. By the many reviews that have come out, it appears this GPU had performance room taken from it during manufacturing. Why? It’s also priced fifty to a hundred dollars more than maybe it should be.

This isn’t the first time a company has put out a GPU that doesn’t make sense. NVidia themselves did it with the GTX 1070 Ti, and AMD most recently did it with the RX 6500. The 1070 Ti launched in a very tight market, when GPUs could still be easily overclocked, and had to be kneecapped so it stayed under the more expensive 1080 and 1080 Ti. The 1070 Ti was $399, where the 1080 came out at the same $599 price we see for this one.

AMD’s grand effort was the RX 6500, which had a PCIe bus configuration one quarter the width of many other cards. That may be fine for newer systems running on a 4th generation high-speed bus, but PCIe gen3 was half the speed. I mention that because many of these cards were being used in older systems because of price. A price that, by many accounts, was still too high at around $230. Even at $200, it was a hard sell. The RX 6400 was even worse, but had a low-profile option. Even with a slower transfer rate, it was better than other LP options.
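To put rough numbers on that bus penalty, here is a sketch using approximate per-lane PCIe throughput (after encoding overhead); the RX 6500's x4 link width and the usual x16 are from the cards' specs, the per-lane figures are ballpark values.

```python
# Approximate usable PCIe bandwidth in GB/s per lane, after
# encoding overhead: gen3 ~0.985, gen4 ~1.969 (ballpark figures).
PER_LANE_GB_S = {3: 0.985, 4: 1.969}

def link_bandwidth(gen: int, lanes: int) -> float:
    return PER_LANE_GB_S[gen] * lanes

print(f"{link_bandwidth(4, 4):.1f}")   # 7.9  -> the x4 card on a gen4 board
print(f"{link_bandwidth(3, 4):.1f}")   # 3.9  -> same card on gen3, half speed
print(f"{link_bandwidth(3, 16):.1f}")  # 15.8 -> a normal x16 card, even on gen3
```

So on an older gen3 board, the narrow card gets about a quarter of the bandwidth a full x16 card would, which is exactly where budget buyers were putting it.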

The RTX4070 advantage?

The new offering from NVidia does have a few things going for it. It sips power. In a time when the 4090 needs 600 watts, the 4070 gets by with a max of 200. That also means it runs cooler and can be smaller. One of the largest complaints about the higher-end 4000 series is the size of the cards. Larger power draw means more heat, which needs a bigger cooler. The 4070 doesn’t have that challenge.

It also has Deep Learning Super Sampling 3, which doesn’t exist on previous series cards, or on AMD at all. AMD does have a similar tool in FidelityFX Super Resolution, negating part of that advantage, though. One thing that can’t be negated is a superior video encoder. The NVenc encoder still reigns supreme, and although newer AMD cards carry an AV1 encoder, some platforms don’t support it. If the streaming platform doesn’t support a process, then that process, no matter how good it is, may be useless. And the AV1 encoder IS a great tool.

So the only real question is the price. Is $600 too much to ask for a card that barely beats the next tier from the last generation? AMD has superior cards for a similar price if you are just worried about frames per second, but they also draw more power and lack some of the features. Still, they are great cards, and do make more sense. Well, if any of these cards make sense at current pricing.

My Conclusion?

The RTX 4070 is not the worst ever. That honor still goes to the RX 6500 from AMD, BUT it’s on the list. The GTX 1070 Ti may be neck and neck with it, but selling this card for fifty bucks less would probably have kept it from being considered at all. Something to think about: the $500 launch price of the RTX 3070 works out to about the 4070’s $600 today, due to inflation and current market conditions. Money doesn’t go as far as it once did. Still, we want to feel like we are getting our money’s worth.

We probably are, but it doesn’t feel like it.
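The inflation point above is easy to sanity-check. Assuming roughly 20% cumulative inflation between the 3070's late-2020 launch and today (my ballpark, not an official CPI figure), the math lands right on the 4070's price:

```python
# Adjusting the RTX 3070's $500 launch price by an assumed ~20%
# cumulative inflation. The 20% is a ballpark, not an official figure.
launch_price = 500
cumulative_inflation = 0.20
adjusted = launch_price * (1 + cumulative_inflation)
print(f"${adjusted:.0f}")  # $600
```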

Check out the YouTube video here.

Back to the Blog page

The most I can get out of the Xeon?

How did we get here?

Some may remember me testing the Xeon ‘replacement’ for the CPU in my Optiplex that wasn’t a replacement. Long story short, the graphics card was the issue and I needed to do something else. I opted to throw it in another case with a different motherboard and a better GPU. That was better, but I thought there was still more I could get out of it. But how much?

To find out, I borrowed the RX 6600XT from my gaming rig in the living room. It won’t stay in there, obviously, but I know it’s capable and I have some tests with similar CPUs. It’s the most powerful GPU I can get to easily, so I put it in, expecting the pink case to now be an easy bake oven. I was wrong. Oh, so wrong.

Performance was better than expected and temps were perfect. The CPU and GPU both stayed in the mid to high sixties on the high end; something I never saw coming. The pink case has horrible airflow and hates anything more powerful than a Casio watch. For some reason, though, this combination loved it. I’m sure it will be better in a different case, but for testing, I’ll take it.

The important thing, though, was finding where the CPU bottleneck would come, and testing comparisons to the other processors. I say test comparisons because in real-world scenarios, paired with a mid-budget GPU playing games at 1080P, there is no difference. Admittedly, with the better GPU I also added more RAM, but the games never used it, so it wasn’t a factor. This thing is solid, and it will probably be the ‘go to’ PC for playing games in the bedroom, if I bother to set it up.

Xeon? For real

Getting one of these and testing it against some of the newer four-core, eight-thread CPUs was a great choice. It’s much better than trying to hunt down and pay twice as much for an i7 4770, and it makes me want to see how the larger-socket E5 Xeons do. That’s another experiment, though. This one goes down as a success, and I couldn’t be happier with the results. Talk about a surprise. Great performance, a cheap price, and low temps. There isn’t much more you can ask for.

One of these paired with an RX580 is a perfect budget gamer, and 1080P is no problem with anything you want to throw at it. Now it’s time to get my 6600XT back out, and maybe try a GTX 1660. Hmm.

Back to the blog page