STFN

Preparing for my future house: building a 2U NAS from scratch


As I have already mentioned several times, I am in the process of building my future house. One of the things I look forward to the most is a separate technical room. It will mostly house things like the heat pump and the electrical box, but it will also have space for a server rack. Which means I will be able to move my whole homelab, both the networking and the computers, into a single place.

This also means that I can convert my servers to be rack mounted, which makes me very excited and almost feel like an adult sysadmin.

I prefer to work in small iterations: see what happens and decide what to do next. That works better for me than making grandiose plans from A to Z. That’s why I decided to first rackify (is that a word? Let’s make it a word) my small backup server, see how it goes, what the issues are, etc., and later on, with that new experience, work on transferring my main NAS/server to a rack case.

And I think it was a very good decision, based on what happened with this build.

The initial state

I built the first generation of my backup server mostly from parts left over from earlier iterations of other computers.

BTW, I still find it fascinating that I remember the times when there was the Single Family Computer In The Computer Place, and now I have several different compute devices in all shapes and sizes, and a box of surplus computer parts. But I digress.

How it looked at the beginning.

The motherboard was some old Socket 1150 board with a Haswell i5-4460 and 8GB of RAM in total. There was also a Mellanox ConnectX-3 10G SFP+ network card, which I had not used yet but bought for future use. The boot drive was some random Silicon 128GB SSD, and the main storage drive was a 3TB HGST 3.5” HDD. The PSU was some basic 500W Seasonic.

I wrote about that server in more detail in my State of my homelab in June 2025 blog post.

Since the beginning, the aim of that server has been what I like to call “lukewarm storage”, the intermediate between cold and hot storage. Meaning I start it every few days or weeks, depending on how much has changed on the main NAS, copy the files over and scrub the disks.

This was the basis for moving to a rack case, and I thought it would be a quick and simple move. Yeah.

Moving to a rack case, first try

For the rack case I went with a Netrack 2U 390mm case.

I chose this one because it was cheap, and the description said that it could fit an ATX PSU and a mATX motherboard, so I assumed that I could just move the existing components from one case to another. It also has good cooling of the drives, which is a must for me. The reviews were mixed, but I yolo’d it and made the order.

The case arrived, and it looked sturdy, made from thick metal. The overall build quality is okayish, there are some rough edges, but it’s not meant to be a showpiece, it will live in the technical room.

The first thing I did was replace the two provided front fans. The ones that came with the case were MOLEX fans without any PWM. I replaced them with a pair of basic Arctic 80mm fans.

Then came the time to assemble the rest of the components, and this is where things became very interesting.

First iteration of the server fully installed.

The first problem was that when using a mATX motherboard, there is space for only two drives, as the bottom brackets of the disk caddies are too close to the motherboard. This was the less important issue, as I had only two drives in total.

The bigger problem was the PSU.

Again, the description of the case said that an ATX PSU would fit inside.

And that was technically correct. And well, the PSU did fit, but it was flush with the full metal sides of the case, leaving absolutely no space for ventilation, as can be seen in the photos.

PSU sitting flush with the case

Enclosed from all sides.

I installed and connected everything, and ran the server with the case open and the PSU not screwed down. The server did run, but it was not a reasonable solution.

And then I went on holidays.

Getting new parts

During the holidays, I did some research and found a solution to my problem. It turns out Chieftec makes a PSU specifically for such a use case, with both the intake and exhaust openings on the shorter sides, allowing it to be placed in 2U enclosures.

Chieftec PSF-400B

It was relatively cheap, at 200PLN (~50EUR), and my only hope, so I ordered it towards the end of my time away, so that it would arrive on the day I got back home.

Also, one rainy day I was a bit bored and browsing Vinted, and I found a very good deal on an AM4 motherboard with a Ryzen 2200G CPU and 16GB of RAM. The motherboard looked smaller than the Socket 1150 one I had, so I hoped I could fit more drives into the case. The Ryzen was a G model, meaning built-in graphics, so I would not need a discrete GPU, just like with that Intel. And ZFS could for sure use some more processing power and more RAM. So I ordered it as well.

Another upside of moving to an AM4 board was that it had an M.2 slot, so I could ditch the 2.5” boot SSD. For the boot drive I did another yolo gamble and ordered a second-hand 32GB Intel Optane M.2 NVMe. I don’t need a lot of space, as this PC will be strictly a storage server, no Docker, no services, so 32GB is plenty for Ubuntu Server (I went with Ubuntu because of its good ZFS support), and from what I’ve read, Optane has greater endurance than basic NVMe SSDs.

The final issue was cooling. The AMD stock cooler is simply too big to fit in a 2U case, so I went looking for solutions. And the solution was on AliExpress, where I found an AM4 server cooler designed specifically for 2U cases. It was 70PLN (~16EUR).

AMD stock cooler goes out.

New cooler goes in.

COOLSERVER!

What is very important is that it is made by the COOLSERVER company, and so confirms that this build is a server. And it is cool.

Second try with new parts

The assembly was straightforward: I removed the previous parts (they will probably be sold) and installed the new ones.

The PSU, well, let’s just say it does not inspire much confidence with how it looks. It was sold as “bulk case”, which meant I got it in a random gray box wrapped in bubble wrap; there weren’t even screws provided with it. But exactly for such cases I have my Drawer of Random Screws and Bolts.

Like many other Chieftec products, it looks like something made by a blacksmith and not PC equipment, but then again, I’ve never had a Chieftec product fail on me. And again, this is not meant to be a looker, but a quiet worker.

And quiet is not the word I would use for that cooler. With its tiny fan, I knew it would be loud. I ran it on my desk, connected to a 12V power source through a chain of adapters, and wow, it was like a swarm of angry bees caught in a tornado (that’s why they are angry).

The good news is that the new PSU fits and the cooling holes are right where they need to be.

The bad news is that even though the motherboard is indeed smaller, the way the 24-pin ATX power connector is placed, there is still only space for a single 3.5” HDD. I could buy 5.25” to 3.5” adapters and put the drives in the CD-ROM bay, but they would not have any cooling there. I guess that if I want to put more drives into this case, I will need to switch to a mini-ITX board.

The final touch is a low-profile 10G network card that I got for very cheap. I will test it once my Mikrotik CRS305 10G switch arrives.

Second iteration done.

Setting up the server

I managed to do some cable management, connected the Internet (for now through the 1G RJ-45 port on the motherboard) and a keyboard, burned Ubuntu Server onto a USB drive (funny to say “burn onto a USB drive”, another leftover from the CD era), and started the newly built server.
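As an aside, “burning” the image is a single command these days. A hedged sketch, where both the ISO filename and /dev/sdX are placeholders I made up (verify the device with lsblk first, because dd overwrites its target without asking):

```shell
# Placeholder sketch: adjust the ISO name and the target device to your setup.
# List block devices to identify the USB stick before anything destructive.
lsblk -d -o NAME,SIZE,MODEL 2>/dev/null || echo "lsblk not available"

# Write the installer image to the stick (replace sdX with the real device):
# sudo dd if=ubuntu-server.iso of=/dev/sdX bs=4M status=progress conv=fsync
```

The `conv=fsync` makes dd flush everything to the stick before exiting, so it is safe to pull it out as soon as the command returns.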

And wow, even though the first thing I did was go into the BIOS and change the fan curve to “silent”, that CPU fan is awful. I am so glad this PC will rarely run, and in a separate room. To bear with it running on my desk, I had to use ANC headphones. But it gets the job done; even under load the CPU temps would not go over 45°C.

Apart from that, the rest was totally boring. I installed Ubuntu Server, set up ZFS, imported the pool, installed Sanoid and Syncoid, configured the SSH keys, made a short bash script for pull-style replication, and the config was done.
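For the curious, a pull-style replication script like that boils down to very little. A minimal sketch, with the caveat that the host, pool, and dataset names below are placeholders I made up, not the actual ones from my setup:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a pull-style replication script. The host and
# dataset names are placeholders, not from the real configuration.
set -uo pipefail

SOURCE="root@main-nas:tank/data"   # dataset on the main NAS (assumed name)
TARGET="backup/data"               # local dataset on this server (assumed name)

replicate() {
    # syncoid pulls the snapshots Sanoid created on the source over SSH
    # (key-based auth assumed), then a scrub re-reads and checksums
    # everything already on disk.
    syncoid "$SOURCE" "$TARGET" && zpool scrub backup
}

# Only attempt replication on a machine where the tools actually exist.
if command -v syncoid >/dev/null 2>&1 && command -v zpool >/dev/null 2>&1; then
    replicate
else
    echo "syncoid/zpool not available, skipping"
fi
```

Sanoid itself handles snapshot creation and pruning according to its config file; the script above only moves the snapshots across and kicks off the integrity check.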

Bottom Line

I’m glad I went with the secondary server first to gather experience. Some of the issues I encountered here will not be a problem when moving the main NAS to a rack mount, as for that one I plan to use a 4U case so the GPU fits, which means I should be able to use a standard PSU.

But now I know that I will need to buy a deeper case so that there is space for both the drives and a normal motherboard.

If everything goes right, I should move to my new house at the end of this year, or the beginning of the new one.

This has been the first episode of me preparing my homelab for the big move, and I am sure there will be at least a few more before the highly anticipated event.

Thanks for reading!

If you enjoyed this post, please consider helping me make new projects by supporting me on the following crowdfunding sites: