Pains of a Mini-ITX powerhouse

If you are planning to build one: don't.
But if nothing is going to stop you either way, here you will find some tips.
My entire life I had been building standard-size ATX machines, which were heavy and required a lot of space on the floor or the desk.
However, such builds have a significant upside: plenty of room inside the case for easy assembly and future upgrades, and off-the-shelf parts just work.
All this changed when I decided I wanted a Mini-ITX platform, so it could in theory fit in my backpack (not that I'm going to put it there). After searching through some popular cases, I ended up with the FormD T1, and that is where all the fun started.
The build started at the end of November 2022, when GPU supply shortages were easing, the new AMD platform had been announced, and parts were starting to come out.
Final part list:
Case: FormD T1 V2
Mainboard: ASUS ROG Strix X670E-I Gaming WIFI
Processor: AMD Ryzen 9 7950x
Graphics card: Sapphire Radeon RX 7900 XTX 24G
Memory: Kingston KF560C40BBK2-32
Storage: 2x Solidigm P44 Pro 2TB
Power supply: Corsair SF750
Fans: 2x NF-F12 PPC 3000 PWM
Processor water block: EK-Quantum Magnitude AM5
Graphics card water block: EK-Quantum Vector² RX 7900 XTX
Radiator: Alphacool NexXxoS ST20 HPE Full Copper 240mm
Water pump: Alphacool DC-LT 2 - 2600rpm Ceramic - 12V DC
Reservoir: Alphacool Eisstation 40 DC-LT
Fittings: EK-Quantum Torque Micro HDC 12
90-degree fittings: EK-Quantum Torque Micro Rotary
Tubing: EK-Loop Hard Tube 12mm 0.5m
Cables: Custom
Wire: Helukabel SiFF 1mm2
Crimps: Molex Mini-Fit 39000078
Crimp extractor: Molex 11030044
Crimpers: Engineer PAD-02
Assembly of components
Realization of mistakes and dread
The computer case was the first component I ordered, and throughout the entire build I was extremely happy with this choice. It's not cheap, and shipping from the United States plus customs charges made it quite expensive.
But the quality, how solid the entire thing is, just makes it worth it. Fully assembled, the computer weighs ~8kg, yet the case shows no warping, flexing, or any of the other issues you might expect given the density of this build.
The first custom components I had to make were the wires. The original cables were too long and too stiff. It took me some time to identify the metal crimp terminals used in the connectors. They turned out to be the Molex Mini-Fit series, and from then on it was relatively easy to choose ones suitable for my wire core and insulation thickness.
Connector housings were a different story: I was unable to locate models compatible with my power supply and mainboard. The connectors I did find had slightly different keying in the product pictures, which could have been a non-issue if the pictures were simply inaccurate, but it still presented some risk. Since I had no use for the original power supply cables, I decided to reuse their housings by simply removing the wires from them. The crimp extractor I had from a previous project was extremely helpful here.
The entire wiring process took about 8 hours of continuous work: measuring, crimping, testing. In the end I still mixed up some wires and ended up with a dead short on the CPU cable. Good thing the power supply's short-circuit protection worked well and nothing melted or caught fire.
All throughout the custom cable wiring process, these guides were extremely helpful:


And the final result I ended up with:




After the cables were done and the pump and all the flex tubes were (temporarily) installed, the system was ready for its first power-up. Which... went smoothly. But the entire setup was messy and nowhere near the end result I was looking for.

By now you can probably see some issues: the radiator is a bit too big, and the pump takes up a lot of space (even though it's the second one I ordered). The tubes also require quite a lot of room, so even a 120mm radiator might not fit.
This is where my entire pain train started.
According to the online documentation for this case, I could in theory fit a <55mm sandwich of radiator and fans on the bottom of the case (in the picture it's the top). But 55mm is only possible when using the 120mm brackets, and that took me way too long to figure out, as the brackets simply look the same height out of the box.
I had used the fan brackets, which are taller than the AIO brackets; that eats into the available 55mm considerably, which made nothing fit.

Throughout the building process I purchased three different radiators and fans in two different thicknesses. This could have been prevented by measuring things at the start, but the information I had at hand when ordering the case, and the case reviews I read, seemed reliable, with no warnings about the caveats people could encounter. This seems like the classic situation of people only sharing their successes, not their failures.
With this figured out, it was clear that I was not going to use any of the pumps/reservoirs I had ordered; I had to look for something tiny. Luckily, one build I saw on the internet used the Alphacool DC-LT pump with a tiny 40ml reservoir, which in my situation fit just barely above the PSU, as the graphics card took up almost the entire length of the case.
Two months into the build the graphics card water block arrived, and nothing was out of the ordinary: one of the least painful parts of the build. HOWEVER, the inlet and outlet ports can be routed to opposite sides or to the same side. That means four holes, two of which have to be plugged. A non-issue in standard builds, but in this Mini-ITX case it caused large problems.
When the plugs are screwed in, they are not flush with the inlet/outlet block; they stick out.
This meant they collided with the case frame, leaving the graphics card extremely crooked.


EKWB does not sell an inlet/outlet block with holes on only one side, so I was left to my own devices.
After some thinking, I decided to try filling one half of the block with two-part epoxy. The process was quite simple:
1. Remove the plugs
2. Put duct tape over the side that had the plugs
3. Mix some epoxy
4. Carefully and slowly pour epoxy into the holes up to the midpoint
5. Wait a couple of days for the epoxy to cure

After the epoxy had cured, I pressure-tested it, and everything held fine.
This entirely fixed the misaligned graphics card.
Note: there is also a 3D-printable spine that provides the extra clearance required for the port plugs; however, I have not tested it, as I wanted to keep the original metal part:
https://www.printables.com/model/315818-formd-t1-modified-spine-for-ekwb-3080-tuf
After some tinkering around I got most things to fit: a 20mm-thick radiator plus some 3000rpm industrial Noctua fans. Cooling won't be an issue if I ever need it.


The next step was to start bending tubes. Everything went... slowly, but nothing out of the ordinary:


Before filling the loop with liquid, I performed one final leak test using the port on the pump block. It revealed one fitting that wasn't sealing properly; after correcting that, no more issues appeared.
Fill up time!


Filling is difficult because there is only one port available on the pump block: getting the air out of the system takes a lot of time and requires constantly flipping the case, and on top of that one cross-brace has to be removed, so the structural integrity of the case is compromised.
Doing it slowly and carefully was perfectly fine; at this point the build had been in progress for 3 months, so taking an additional hour and not rushing seemed like a reasonable thing to do.
In search of the performance
A.K.A. Overclocking
Everything started before the first parts were even ordered. In November 2022 I accidentally came across the Actually Hardcore Overclocking YouTube channel while searching for various computer parts, and it just sucked me in: some videos demonstrated basic RAM timing adjustments and their potential performance gains. After spending more than 20 hours on that channel and watching various RAM timing guides, I felt confident that this was something I had to do to my system.
Many, many reboots and factory resets later, I ended up with something usable.
In many cases when I did something wrong, the system would simply not POST, or would bluescreen at the first chance or shortly after starting a stress test.
My procedure for initial stability testing was quite primitive:
- Ensure the system does not reset/blue screen or lockup
- Run Y-Cruncher PI 2.5B on all cores
Y-Cruncher was also useful for reporting memory corruption issues that would not cause an immediate blue screen or system crash.
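As a small illustration, that pass/fail judgment boils down to scanning the stress run's console output for signs of trouble. The helper and marker strings below are a hypothetical sketch, not Y-Cruncher's actual message format:

```python
def run_looks_clean(log_text: str) -> bool:
    """Return False if a stress-test log hints at instability or corruption.

    The marker words are hypothetical examples, not Y-Cruncher's real output.
    """
    bad_markers = ("error", "corrupt", "mismatch", "failed")
    lowered = log_text.lower()
    return not any(marker in lowered for marker in bad_markers)

print(run_looks_clean("Pi computation done. Validation: passed"))   # True
print(run_looks_clean("Redundancy check FAILED: digit mismatch"))   # False
```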
Y-Cruncher also had a side effect: since it's used in competitive overclocking as one of the tools to measure performance, I was able to compare my progress on leaderboards. Once I saw that my memory timing adjustments gave me a faster time than some people running sub-zero CPU cooling with liquid nitrogen or dry ice, something just sparked inside me... I wanted to be first.
After some time spent on the memory timings, the best time I could get was 41.875s.
At the time of benchmark (and writing this page), it's 3rd place in the HWBOT scoreboard.

This overclock was not stable in all situations: some loads on a few threads would cause a system reset, blue screen, or lockup. The only change needed to make the system fully stable was reducing FCLK from the 2033MHz shown in the picture to the default 2000MHz.
But this increased the average benchmark time to ~43 seconds.
Not first place, but still an amazing improvement over the factory configuration, which finishes the benchmark in ~57 seconds.
So fine-tuned timings provide a ~25% reduction in benchmark time.
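For reference, the ~25% figure follows directly from the times quoted above:

```python
# Benchmark times quoted on this page (seconds, Y-Cruncher Pi 2.5B).
stock_time = 57.0   # factory configuration
tuned_time = 43.0   # fully stable tuned configuration

reduction_pct = (stock_time - tuned_time) / stock_time * 100
print(f"{reduction_pct:.1f}% faster than stock")  # 24.6% faster than stock
```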
But this is not the end... meet direct-die cooling.


Thermal Grizzly had just recently released its Ryzen 7000 delidding tool and all the required mounting hardware.
The entire delidding process was kinda nerve-wracking, but I took it slow and in the end it seemed to work out. I still don't know whether the CPU is undamaged. The delid did cause some extra challenges when mounting the water block: the VRM heatsink on the motherboard interfered with the water block I had, which meant I had to look for another one. That's how I ended up with the EK-Quantum Magnitude AM5, which is narrower by about 10mm, allowing it to clear the motherboard's VRM heatsink.


The water block mounting currently puts far less pressure on the CPU dies than the original configuration, where it pressed against the integrated heat spreader. This is something I will likely need to adjust after some tests.
Something I also did differently is the thermal paste application. Typically it's recommended to apply thermal paste in lines, dots, crosses, or some other pattern on top of the CPU, but in this case I spread it across the entire CCDs and IO die to ensure the whole chip was covered (reference).
The new water block requires new hard lines to be bent, which I'm still waiting to arrive...
Direct die mounting challenges
I used the imprints left in the thermal paste as a reference for where to apply the liquid metal; before cleaning off the paste, I drew guide lines marking the die locations.


After the thermal paste application (and the later change to liquid metal) I noticed a decrease in total CPU power consumption, an increase in temperatures, and lower overall frequency... a loss of performance compared to the stock configuration with the integrated heat spreader. This was quite a surprise at first.
After some investigation an interesting pattern showed up: cores 7 and 8 on each CCD perform worst, while cores 2, 4, and 6 are best on the first CCD and cores 1, 3, and 5 are best on the second. Core temperatures ranged from 50°C to 90°C under an all-core load, a 40°C delta between the coolest and hottest core... not cool.
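To make the spread concrete, here is a tiny sketch that derives the 40°C delta. The per-core temperatures below are made-up illustrative values that only mirror the pattern described above; the real numbers came from monitoring software:

```python
# Hypothetical per-core temperatures (C) under all-core load; illustrative
# only -- the shape matches the pattern described above, not the exact log.
core_temps = {1: 62, 2: 55, 3: 64, 4: 53, 5: 66, 6: 57, 7: 88, 8: 90,
              9: 52, 10: 68, 11: 50, 12: 70, 13: 54, 14: 72, 15: 86, 16: 89}

delta = max(core_temps.values()) - min(core_temps.values())
coolest = min(core_temps, key=core_temps.get)
hottest = max(core_temps, key=core_temps.get)
print(f"coolest core {coolest}, hottest core {hottest}, delta {delta}C")
# coolest core 11, hottest core 8, delta 40C
```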
Here are the core temperatures during Y-Cruncher Pi 2.5 Billion stress test:

Each core was tested to see how it performed at full load, and the results are interesting.
The top part is very clearly the worst, and the bottom area between the CCDs is the best:

The temperature gradient seems to correlate quite closely with the cold plate curvature (the Quantum Magnitude uses the R3000 curvature cold plate by default), which helped with IHS contact before delidding.
A flat cold plate has been ordered from EKWB; once it has been tested, this article will be updated.