The fun in assembling a great Intel Test Rig

How did we get here?

The first part of building a solid Intel test rig is, of course, the platform. The i5 11400F was my choice for the CPU; with six cores and twelve threads, it's a great midrange processor. I had recently purchased an Arc A750 graphics card, so all I needed was the supporting material and cooling. I have a DeepCool case and recently picked up a set of DeepCool RGB fans, so a plan was forming. The power supply and storage were also ready, so I gathered everything I needed and was off to a great start.

Assembly wasn’t difficult. The case is great to work in and the fans fit perfectly. There was already an NVMe drive installed, and except for an errant motherboard screw, everything fit well together. The second step was going well. That’s where the fun stopped, though. The motherboard I chose had a BIOS from before the Arc A750’s release, and it took a fair bit of troubleshooting before I discovered this.

The NVMe drive also didn’t show up as a boot drive, even though it had a fresh copy of Windows 10 Pro. Troubleshooting one problem at a time is fair, but this one masqueraded as a DRAM problem, then a CPU issue, instead of pointing to the GPU. One complete disassembly and reassembly later, I thought of the BIOS. Achievement unlocked.

I still had no drive showing up, so I tried to repair Windows. I don’t know why there is a repair utility in the Media Creation Tool. It never works, EVER. Time to reinstall instead. Nope, the NVMe already had a master boot record. Seriously? The BIOS says it doesn’t (even after the update) and the Windows utility says it does. This isn’t going great anymore.

Day Two

With a cooler head, I came back, grabbed an SSD with no data on it, and started over. To my amusement, everything worked. I then started on my normal installs, only to realize that if I were going to do this right, I would need an Intel NVMe drive. Stop installing stuff, Paul. And, while you’re at it, order another DeepCool fan to balance out the RGB. Done. Time to edit what I had and post it on YouTube.

Everything wasn’t done yet, but there was enough material for the build video. Once the video was up, I went to dinner. I then realized that I had forgotten the thumbnail. Oh, I made one, I just forgot to use it. The frustration finally got me. It would be hours until I could fix it, but it couldn’t be helped. It would wait.

Finishing the Intel Test Rig

All I had to do was wait for the new drive and the extra fan, and I could start benchmarking. What? A notice about my order? Delayed? What the…?

To be continued. In the meantime, check out the Arc A750 GPU post.

The HP Franken Dell. Scary project or an outstanding idea?

Not everything is a bad idea.

Recently, I tried a few fairly standard upgrades on an old Dell Optiplex. They were simple but effective, and all was good. So, how did we get from there to the HP Franken Dell? There were a few steps.

Over a year ago, I ordered the cheapest DX12-capable prebuilt PC on Amazon. It was a refurbished Dell Optiplex with an i5 4670 four core, four thread processor and no dedicated graphics card. It did come with dual channel memory, but only 8GB. Upgrades were relatively simple: more RAM capacity, an SSD instead of the hard disk drive, and a graphics card. It wasn’t the best gaming PC in the world, but for about $330, accounting for price drops, it was solid. One of the videos benchmarking current games can be found here.

Then came the idea of upgrading the CPU to a Xeon. I chose the Xeon because the natural upgrade path (the i7 4770) was more expensive. The E3 1270v3 has almost the same clock speed and the same number of cores and threads. It was also only 28 bucks and some change.

The surprise came when it turned out not to be an upgrade because of the GPU. A better GPU meant needing more room, and while I was at it, I might as well pick up a motherboard for another $18. Nice.

Not an Optiplex anymore

Freed from the Optiplex case, I could test the Xeon with other video cards, and I found that the four core, eight thread chip actually performed very well for a ten-year-old processor. The problem was, it was stuck in a lousy case. The temps were a bit high on the GPU side, but not horrible. I just wasn’t sure what I wanted to do with it.

At the same time, I was considering turning the old HP a6512p into a bit of a sleeper build. The fifteen-year-old PC wasn’t very capable of playing modern games, even with a decent GPU, but it did have a redeeming quality: a standard mATX motherboard fit inside, and it took a regular-size power supply. The same size motherboard that the Xeon called home. Hmmm. Nah, but maybe, just maybe, it would work.

I knew there were going to be a few issues, the front panel accessories being one of them. The front panel lacked the USB 3 available from the newer mobo, and there was a FireWire (1394) connection that was useless. An expansion bay including USB 3 was cheap, so it made the list. Also on the list was something to help with the adapters I had already purchased that worked for the pink case.

Finally, the HP Franken Dell.

When it was all said and done, it worked. Temps weren’t great, but I know there isn’t a lot of airflow in that case, so it’s something to work on. Maybe a Noctua fan will move more air and be quieter. As it stands, I have some room, but not a lot, for more drives, and I have to consider whether the FrankenDell will be a better solution for the server I keep putting off. Noise and temps first, then I will explore more drives.

Overall, it was a fun experiment. Parts fit where maybe they shouldn’t have, and I can see why companies like HP and Dell now try to make proprietary parts. (Though, some of their parts could use a good swap.) I don’t yet know what will become of the older HP motherboard and Q6600 that were originally in the a6512p, but I also have an AM3+ motherboard around somewhere without a home. Hmmm, I wonder if I should get an FX processor and pit it against the Intel CPUs like the Xeon. To be continued… maybe.

Link to the YouTube video about the HP FrankenDell.

And of course, back to the blog section

Intel Arc Graphics. Outstanding, Good, or Trash?

How did we get here?

Intel dominates part of the central processing arena, but what about graphics? For several years there have been two major players in the GPU space, NVidia and AMD. They’ve had decades to refine that craft, and both have strong products. NVidia has dominated until recent years, but what about a third? Can someone else come in and compete with two established juggernauts?

Enter the Arc line of GPUs. It’s Intel’s first foray into the market with a dedicated PCIe card, and their first dedicated card since the i740 AGP card of 1998. They have had integrated graphics built into motherboard chipsets and processors for decades, but steered clear of discrete cards. That is, until lately, with Xe graphics.

Okay, but Intel?

Having a small YouTube channel gives me an opportunity to test many different PC parts. Most are mainstream, some a little odd, but which category does the Arc GPU fit into? I didn’t have an interest in that question until I found out Intel looks to continue making GPUs. At the time of writing, there are many unconfirmed rumors that Intel will indeed continue for at least two more generations.

Intel has three discrete cards, the A380, A750, and A770, all three reasonably priced. I chose the A750 because it has fair competition in the same price range: NVidia has the RTX 3050 and RTX 3060, while AMD has the RX 6600 and 6600XT, and both also have older products. With an RTX 3060 and RX 6600XT on hand, I had test material. If only it were that easy.

First impressions of the card were favorable. It’s sleek, appears to be well built, and is beautiful to look at. Other things about it aren’t as compelling. The drivers are cumbersome, though I understand they’re much improved, and the need for Resizable BAR (a PCIe feature that lets the CPU address all of the GPU’s memory at once) required a BIOS update. It also required digging through outside source material to configure the BIOS before I could reinstall the graphics drivers. Setup wasn’t painless.

Finally

It took several hours, but I was finally able to start testing. With the proper drivers installed, I spent some time repeating the same exercises that gave me issues on the first attempt, and I was pleased to find things finally working. No locked screens, no blackouts, and no error message saying Resizable BAR wasn’t enabled. It was finally time to test games.

The Arc performed well in DirectX 12 titles, but I was not prepared for the nosedive with the DirectX 11 API. The card stumbled out of the gate on just the second game, Borderlands 3. It typically runs better on the older API than the new one, but always within margin of error. Results didn’t get above 77 frames per second in DX11, where the other cards approached 200. I knew it struggled, but not that much.

So, is the Intel Arc GPU junk?

Not at all, but it does still need some work. The performance in DirectX 12 matches the other cards, and switching Borderlands 3 to DX12 saw it beating the 3060. I need another example of a Vulkan title, but what I’ve seen so far is promising. Temperatures stay in the low to mid 70s Celsius, and the clock speed holds at 2400 MHz. Menus and apps are responsive with no delays or hang-ups. Overall performance is good, with no artifacting or glitches. It also looks nice, with smooth lines and a tasteful white LED.

AMD and NVidia options straddle the Intel cards in both price and performance, and although Intel has no high-end offering, it wasn’t long ago that AMD only targeted the midrange market. The A750 retails at $250, placing it firmly in the mid tier of GPUs. It’s a solid card, and I look forward to testing others.

The video can be found here

Back to the blog page

Buy a new laptop or dust off the old one?

How did we get here?

This all started when I was getting a bit frustrated with my PC setup in the office and needed a break. I pulled out my Dell Inspiron laptop from 2018 and began working on my blogs and, eventually, slides. I carry it with me occasionally, but almost never use it because of an annoying popping sound from the sound driver, and because the bottom cover won’t stay on without some exterior help: tape. Tape is messy.

In any case, I had just gone through testing the new Xeon against the older Optiplex and realized I had another four core, four thread older CPU. You guessed it, the laptop. The Dell Inspiron in my possession is a model 15 7567 from 2018 and comes with a GTX 1050 dedicated graphics card. When it came out, it was a decent gaming laptop. Now, it ranks faster than only about 5% of gaming laptops. It’s only five years old. I say only five, but computer years are sort of like dog years.

Still, the Optiplex was only a bit older, and it’s a good gamer. I’ve been able to show on the channel that with a little TLC, it performs well even in newer titles. The laptop, with a processor two generations newer, should be fine. The Inspiron comes with a 7300HQ processor, and although it doesn’t meet the specs for Windows 11, neither does the Optiplex. (Who comes up with these names, by the way? Inspiron for personal use and Optiplex for business use, I get it, but… different blog.) The laptop also has a 1080P IPS screen, so I was able to test at full 1080P resolution.

Testing the laptop

My first thought was to compare it to the Optiplex with the newer RX6400, but that didn’t seem fair. I opted instead to test against the RX560 from the same period. The i5 4670 probably did see some pairings with the RX560 4GB video card, and the 7300HQ obviously saw some pairings with the GTX 1050. It’s a matchup that likely got compared head-to-head at some point back in the day.

I then realized that the data I had from the RX560 was about a year old, because I had used that GPU in another build. So, even though the laptop needed driver updates (lack of use), I chose not to install them; I wanted to keep the playing field even. Both machines had 16GB of memory, but the Optiplex came equipped with DDR3 instead of the laptop’s DDR4. Speeds were 1600 and 2400 MT/s respectively, which should be close enough. After all, the Xeon handled almost every game I tested at more than 60 fps with an older RX480. It did even better with an RX 6600XT.

Both of those were written about in earlier blogs, but as we get deeper into the tech side of these, I will provide more detail. In this case, my results ranged from 17-20 frames per second in more demanding games like Cyberpunk, to over sixty frames per second in Forza Horizon 4, all at 1080P, without adding any resolution enhancement. The 1050 is too old for DLSS, but it will work with AMD’s FSR. It was not able to use ray tracing. Just as well.

If it’s there, why not try it?

I did try a few examples of FSR, although I wasn’t able to compare against the RX560. The 21 FPS in Cyberpunk at 1080P High became 27, and the 28 fps on Low worked out to 37 fps. Setting it to 1080P Low with Performance FSR and a frame lock at 30fps might just be the way to play a game like CPK. It’s ironic that AMD’s tech works on an NVidia card. Ok, funny, too. I have my own opinion on whether the big green machine cares about the consumer. RTX 4070, anyone?
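For anyone who wants to check my math, those gains work out to roughly 29% and 32%. A quick Python sketch, using the numbers straight from my runs above:

```python
# Percentage gain from turning on FSR, using my numbers from above.
def uplift(base_fps: float, fsr_fps: float) -> float:
    return (fsr_fps - base_fps) / base_fps * 100

for setting, base, fsr in [("1080P High", 21, 27), ("1080P Low", 28, 37)]:
    print(f"{setting}: {base} -> {fsr} fps ({uplift(base, fsr):+.0f}%)")

# 1080P High: 21 -> 27 fps (+29%)
# 1080P Low: 28 -> 37 fps (+32%)
```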

AMD’s resolution tools do help this combo somewhat, but the problem again is VRAM. More and more games are using more GPU memory, and it kills older cards. 8GB isn’t enough anymore, much less 4GB. Lower settings help, but even with all of the drivers updated, it’s still five years old. Newer budget gaming laptops eat this thing for an afternoon snack, and it doesn’t compare.

What’s next for the laptop?

First things first: I ordered a new bottom cover that will be here this week. No more tape. Second, I will clean up the 1TB drive currently in it and probably replace it with an SSD to hold some games and make it quicker. Then I’ll use it, or I won’t. I have Danny DD, my small form factor PC, but I carry my laptop to many of the same places out of habit. I need to be okay with one or the other. The drivers did get updated as well, and that got rid of that annoying popping sound, though it still doesn’t sound great.

Whichever happens, I’ve discovered that I actually still like it. More than likely, I’ll use it a while to see what happens. I don’t know that I’d like a new laptop better, so I’m leery about dropping extra money, but I might end up taking it places more often. We’ll see.

RTX4070 for $600? Is it now the worst GPU ever?

How did we get here?

The RTX4070 is NVidia’s latest graphics card, and in a place where they could have reset the market and blown the doors off both of their competitors, it seems they have pulled up short. From the many reviews that have come out, it appears this GPU had performance headroom taken from it during manufacturing. Why? It’s also priced fifty to a hundred dollars more than maybe it should be.

This isn’t the first time a company has put out a GPU that doesn’t make sense. NVidia themselves did it with the GTX1070Ti, and AMD most recently did it with the RX6500. The 1070Ti launched in a very tight market, when GPUs could still be easily overclocked, and had to be kneecapped so it stayed under the more expensive 1080 and 1080Ti. The 1070Ti was $399, where the 1080 came out at the same $599 price we see for this one.

AMD’s grand effort was the RX6500, which had a PCIe connection one quarter the width of many other cards. That may be fine for newer systems running on a gen4 high-speed bus, but PCIe gen3 runs at half the speed. I mention that because many of these cards were being used in older systems because of price. A price that, by many accounts, was still too high at around $230. Even at $200, it was a hard sell. The RX6400 was even worse, but it had a low-profile option. Even with a slower transfer rate, it was better than other LP options.
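To put rough numbers on that: PCIe bandwidth scales with both the generation and the lane count, so a four-lane card on a gen3 board gets a small fraction of what a typical x16 card sees. A quick sketch using the standard per-lane rates (about 1 GB/s per lane for gen3, 2 GB/s for gen4):

```python
# Approximate PCIe link bandwidth in GB/s (standard per-lane rates, after encoding).
PER_LANE_GBPS = {"gen3": 0.985, "gen4": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    return PER_LANE_GBPS[gen] * lanes

print(f"x4 card on gen4: {link_bandwidth('gen4', 4):.1f} GB/s")       # ~7.9
print(f"x4 card on gen3: {link_bandwidth('gen3', 4):.1f} GB/s")       # ~3.9
print(f"typical x16 card on gen3: {link_bandwidth('gen3', 16):.1f} GB/s")  # ~15.8
```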

The RTX4070 advantage?

The new offering from NVidia does have a few things going for it. It sips power. In a time when the 4090 needs 600 watts of power, the 4070 gets by with a max of 200. That also means it runs cooler and can be smaller. One of the biggest complaints about the higher-end 4000 series is the size of the cards. Larger power draw means more heat, which needs a bigger cooler. The 4070 doesn’t have that challenge.
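Out of curiosity, I ran that power difference through a quick cost estimate. The gaming hours and electricity price are my assumptions, so plug in your own:

```python
# Ballpark yearly electricity cost from GPU power draw alone.
HOURS_PER_DAY = 3      # assumed gaming time per day
PRICE_PER_KWH = 0.15   # assumed price in USD

def yearly_cost(watts: float) -> float:
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * PRICE_PER_KWH

print(f"600 W card: ${yearly_cost(600):.0f} per year")  # ~$99
print(f"200 W card: ${yearly_cost(200):.0f} per year")  # ~$33
```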

It also has Deep Learning Super Sampling 3 (DLSS 3), which doesn’t exist on previous series cards, or on AMD at all. AMD does have a similar tool in FidelityFX Super Resolution, negating part of that advantage, though. One thing that can’t be negated is a superior video encoder. The NVENC encoder still reigns supreme, and although newer AMD cards carry an AV1 encoder, some platforms don’t support it. If the streaming platform doesn’t support a process, then that process, no matter how good it is, may be useless. And the AV1 encoder IS a great tool.

So the only real question is the price. Is $600 too much to ask for a card that barely beats the next tier from the last generation? AMD has superior cards for a similar price if you are just worried about frames per second, but they also draw more power and lack some of the features. Still, they are great cards and do make more sense. Well, if any of these cards make sense at current pricing.

My Conclusion?

The RTX4070 is not the worst ever. That honor still goes to the RX6500 from AMD, BUT it’s on the list. The GTX1070Ti may be neck and neck with it, but selling this card for fifty bucks less would probably have kept it off the list entirely. Something to think about: the $500 for the RTX3070 at launch would be about the $600 we see for the 4070 today, due to inflation and current market conditions. Money doesn’t go as far as it once did. Still, we want to feel like we are getting our money’s worth.

We probably are, but it doesn’t feel like it.
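If you want to see how that inflation point shakes out, here's the back-of-the-napkin version. The 20% cumulative figure is my rough assumption for late 2020 to today, not an official CPI number:

```python
# Rough inflation adjustment on the RTX 3070's $500 launch price.
LAUNCH_PRICE = 500           # RTX 3070 MSRP, late 2020
CUMULATIVE_INFLATION = 0.20  # assumed 2020 -> 2023, not an official figure

adjusted = LAUNCH_PRICE * (1 + CUMULATIVE_INFLATION)
print(f"${LAUNCH_PRICE} then is roughly ${adjusted:.0f} now")  # ~$600
```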

Check out the YouTube video here.

Back to the Blog page

The most I can get out of the Xeon?

How did we get here?

Some may remember me testing the Xeon ‘replacement’ for the CPU in my Optiplex, which turned out not to be a replacement at all. Long story short, the graphics card was the issue, and I needed to do something else. I opted to throw it in another case with a different motherboard and a better GPU. That was better, but I thought there was still more I could get out of it. But how much?

To find out, I borrowed the RX 6600XT from my gaming rig in the living room. It won’t stay in there, obviously, but I know it’s capable, and I have some tests with similar CPUs. It’s the most powerful GPU I can get to easily, so I put it in, expecting the pink case to turn into an easy-bake oven. I was wrong. Oh, so wrong.

Performance was better than expected, and temps were perfect. The CPU and GPU both stayed in the mid to high sixties Celsius at worst, something I never saw coming. The pink case has horrible airflow and hates anything more powerful than a Casio watch. For some reason, though, this combination loved it. I’m sure it will be better in a different case, but for testing, I’ll take it.

The important thing, though, was finding where the CPU bottleneck would appear and running test comparisons against the other processors. I say test comparisons because, in real-world scenarios, paired with a mid-budget GPU playing games at 1080P, there is no difference. Admittedly, with the better GPU I also added more RAM, but the games never used it, so it wasn’t a factor. This thing is solid, and it will probably be the ‘go to’ PC for playing games in the bedroom, if I bother to set it up.

Xeon? For real

Getting one of these and testing it against some of the newer four core, eight thread CPUs was a great choice. It’s much better than trying to hunt down and pay twice as much for an i7 4770, and it makes me want to see how the larger-socket E5 Xeons do. That’s another experiment, though. This one goes down as a success, and I couldn’t be happier with the results. Talk about a surprise. Great performance, cheap price, and low temps. There isn’t much more you can ask for.

One of these paired with an RX580 is a perfect budget gamer, and 1080P is no problem with anything you want to throw at it. Now it’s time to get my 6600XT out and maybe try a GTX1660. Hmm.

Back to the blog page

SSD excitement quickly turned sour. Not the right one.

How did we get here?

Imagine my surprise when I opened an older laptop and found an SSD. In fact, it was an SSD in an M.2 slot. Now imagine how I felt when I realized it was a SATA drive. Yes, the same protocol behind standard 2.5 inch drives drove this one. Still, how bad could it be? I decided to test it and find out.

The first thing I had to do was find somewhere I could test it. Almost all of my M.2 slots were taken on various rigs. I tried the PCIe adapter, but you can see where that is going by my telling you it was a PCIe adapter: basic M.2-to-PCIe adapters only pass through PCIe lanes, so they suit NVMe drives, not SATA ones. The difference matters for speed, too. As I mentioned in the video, linked here, a SATA drive moves data over a single serial link, like loading a plane one passenger at a time. An NVMe drive uses several PCIe lanes at once; think roller coaster at the theme park that has a separate queue for each car. The roller coaster loads much faster, and will every time.

Still, all was not lost: Danny DD to the rescue. With an empty M.2 slot, I tucked it in, checked the BIOS, and off we were. Sort of. Read and write speeds weren’t great. I ran CrystalDiskMark and got about what I expected: 2.5 inch SSD-type speeds. That doesn’t mean it’s horrible, or even unusable. You just have to have the right use for it. It’s still faster than an HDD, and a budget build is perfect for it.
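To give a sense of what "the right use" looks like, here's a rough comparison of how long a 50 GB game install takes to read end to end. The speeds are typical sequential figures, not measurements from this particular drive:

```python
# Time to read a 50 GB install at typical sequential speeds (MB/s).
TYPICAL_SPEEDS = {
    "spinning HDD": 150,
    "SATA SSD (2.5 inch or M.2)": 550,
    "NVMe SSD (PCIe gen3)": 3500,
}

size_mb = 50 * 1024  # 50 GB install
for drive, speed in TYPICAL_SPEEDS.items():
    print(f"{drive}: about {size_mb / speed:.0f} seconds")

# spinning HDD: about 341 seconds
# SATA SSD: about 93 seconds
# NVMe SSD: about 15 seconds
```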

Good SSD or Bad?

That has to do with your frame of reference. The reason I mention these drives at all is that they are still common. They are available in various sizes, usually at or below the price of the much faster NVMe drives that fit in the same slot. NVMe drives also work with a PCIe adapter if you don’t have a free slot on your motherboard. A SATA M.2 drive isn’t a bad choice, but there are much better ones. It’s best to read the description to make sure you get the right one.

So, what’s next for the SATA M.2 drive? I’ll have to find an inexpensive motherboard and make an extreme budget build. It’s still faster than a spinning hard disk and perfect for a boot drive. It will fit perfectly in a low-cost gaming PC. Paired with a larger 2.5 inch SSD, it will have a lot of life left. Now I need to go shopping.

Back to Blog page

The GPU is the Bottleneck? Again? Really?

How did we get here?

Recently, I tested the newly acquired E3 1270v3 Xeon in the Dell Optiplex as a possible replacement. I found it wasn’t an upgrade because of the GPU. Fair enough. I knew the RX6400 wasn’t a great card, but it’s what fit and it worked.

I then chose to get some parts and put the Xeon in a different case on a Dell motherboard. Perfect. So far, so good. The original plan was to compare the new benchmark numbers to the old to show how much more room we had to grow. As it turns out, it was quite a bit, and with a better graphics card, it now turned in numbers like much newer hardware.

Well, that didn’t turn out the way I wanted. I expected the setup, testing 1440P and 1080P on high, medium, and low quality, to show the Xeon struggling. It didn’t. In fact, it was for the most part within margin of error of the newer CPUs. That wasn’t going to make for an informative video at all.

The Testing

So, I ditched the idea of the comparison with the older i5 and concentrated on testing against the 10105F and 11400F. Not brand new, but recent enough to compare. The 10105F is a four core, eight thread i3 from 10th gen Intel, and the 11400F is a six core, twelve thread i5 from 11th gen. Both of these should be more than a match, so I tested with the midrange RX480 GPU, which was out around the same time as this Xeon. Obviously, the newer CPUs have an advantage, right? Well, no.

As I mentioned, the Xeon stayed within margin of error using the midrange GPU, so what could I do? I dropped the resolution. Instead of 1440P and 1080P, I dropped the settings down to test 1080P Low, 900P Low, and 720P Low. Resolutions the 11th gen chip has no business thinking about, because it would normally run with a better GPU.

Very much to my surprise, the 11400F actually lost to BOTH four core chips in some games. What? Seriously? That can’t be right.

But it was, in multiple games across repeatable tests. It traded blows with the 10th gen CPU in some games, but in others it actually performed the worst. Now, understandably, the six core CPU has no business running games at 720P, and it shows, but that doesn’t solve my original problem. How do I get a fair test between the three processors?

Need a different GPU

Quite simply, I need to be able to test all three of these at a higher resolution, so I’m going to have to use a better graphics card. The RX 6600XT is the easiest one for me to access and test with, so it will be next. To that point, seeing how close the two four core chips are, I will probably just compare the Xeon with the 11400F. I have records of a fair difference between the 10th gen and 11th gen using the 6600XT, so if I have to, I’ll include all three, but this should give us a better idea of how good this Xeon actually is.

I’m curious just how well the older E3 1270 holds up against a far more recent product. This should be fun. The video will be linked here.

Back to the Blog page

The Xeon results are outstanding.

For Starters

I originally put the Intel four core, eight thread Xeon in the Dell Optiplex to see if it outperformed the original CPU. It didn’t. More accurately, it performed about the same, because the limit became the graphics card. The GPU, in turn, was limited by the size of the case, so what could I do?

The only good answer was to put it in a different case. That skews the experiment, though, because it puts the CPU on a different platform that might help it run better, and I wouldn’t know what made the difference. But an Optiplex motherboard would ensure that things stayed relatively equal, and for 18 bucks, it was a no-brainer. So, off to eBay I went.

Then came a different issue: the connections aren’t the same. Adapter cables cost me another twenty-seven, but they came next day. Sweet. Everything showed up, I installed all of the parts, and turned it on. To my great shock and surprise, it worked the first time. That isn’t a normal occurrence, so yes, I enjoyed it. Now, on to the testing.

I decided to match it not only against the older CPU in the Optiplex, but also against newer CPUs six generations younger, and again, I was surprised. It held its own against a 10th gen i3 and an 11th gen i5! It even kept up with a processor that has more cores and threads, the 11400F. Again, as it turns out, it’s only limited by the graphics card. We’re on a roll.

So, what’s next for the Xeon?

It performed great against the two newer processors with a midrange GPU, so the next step is a better GPU. The midrange card was a very capable RX480 that has shown its worth even in 2023, but now we need more. I have an RX6600XT to use, or an RTX 3060Ti, and either should prove worthy. The AMD card is easier to get to and test with, so that will be next, but first I have to record this video. Technically, there are two more benchmarks to run, but there won’t be an issue, and I will set the stage for the next video with the results. This turned out far better than I expected.

Which brings me to my next thought. Do I post the result charts here, or not? Obviously, I won’t post all of the charts, but maybe a couple. I could probably do an average, but I don’t have combined numbers yet, so I don’t know how much extra effort that is. Leave me some thoughts, and I’ll mull it over. In the meantime, I’ll post a few pics and we’ll wait for the video. It should be ready mid-week, and then I can move on to the next part.
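If I do go the average route, the usual trick for benchmark results is a geometric mean rather than a plain average, so one easy-to-run title can't dominate the score. A minimal sketch, with placeholder games and numbers rather than my actual results:

```python
# Combine per-game FPS into one score with a geometric mean.
from math import prod

def geomean(values: list[float]) -> float:
    return prod(values) ** (1 / len(values))

results = {"Game A": 61.0, "Game B": 144.0, "Game C": 97.0}  # placeholder FPS
print(f"combined score: {geomean(list(results.values())):.1f} fps")  # ~94.8
```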

I think I like this Xeon.

Converting a Dell Optiplex…. Interesting.

How did we get here?

I knew there might be more to this than meets the eye when I started, but it’s down the rabbit hole I go. First, I got the Xeon CPU to test. Perfect, so far. Then, after realizing it needed more ‘space’, I ordered an Optiplex 9020 motherboard because it was cheap. The board works beautifully, but there are some challenges.

The first is the power supply connections. I do have an adapter, but the PSU in question doesn’t seem too happy with it. Next are the front panel and power switch connections. Oh brother, more adapters. Lovely. I can do it without adapters, but there is a lot of jury-rigging involved. I played it safe, and for only a few bucks here and there, we are still well below the price of a mainstream motherboard.

At what cost

The biggest challenge is staying below what a mainstream board would have cost, but I don’t want to set the house on fire trying to do it. It wasn’t on my list of goals for the week. It’s not on it for next week either, if you were wondering. I do have some parts, but the trick here will be to come in as close to $250 as possible and still have a solid gaming PC. Buying a mainstream motherboard would have killed that, but 20 bucks is solid.

Counting what I have on the shelf, we’re talking around $250 so far. If we ignore that, it’s much cheaper, but that defeats the point. Using all of these parts versus buying an Optiplex in a bigger case might eventually still be a good comparison. The results of this experiment will tell me if I should try it. Maybe a standard-sized PC and a can of spray paint would be just as effective. I may as well start looking for one to order.

So, what’s next for the Optiplex?

The next part is waiting for the adapters to come tomorrow, and making sure the PSU will work with what I already have. That in itself will be one video. After that, we start testing and see what it best compares to. I know I will compare numbers to the i5 4670 in the slimline, but I also want to compare numbers to a modern four core CPU to see what’s missing after six generations.

The ‘finished’ product will end up on the channel, here, but of course this will be where I discuss what went right and what went wrong, so if you’re reading this, please check back. Until then, I wait. I have begun to assemble the parts that are here, and I probably could do some testing, but then I wouldn’t have needed the new parts, and I wouldn’t get to cover the adapters in the video. Six of one, half a dozen of the other. It’s bound to work out one way or another.

Stay tuned, it should be fun.

Return to the Blog page