My first computer was a PC built from parts thrown out by my dad’s workplace. It was an old tower PC running Windows 2000. Over the years we added parts and replaced others until I bought an Intel Pentium G2020 dual-core at some point, making my PC the fastest one at home back then. A few years later I got a laptop for school, but since then I have always had to transfer files between my machines manually. Now, six years later, I finally got a NAS (network-attached storage) to sync all my files to. While this is still a bit of a work in progress, I want to share my setup.
I could have invested in a turnkey solution like a box from Synology, but that is just not fun, and I wanted to see if I could cheap out a bit. The plan was to use parts we already had at home, like a Raspberry Pi 3B+ as the base and my old laptop SSD in a USB case as the boot and cache drive. The only things missing were the storage drives and USB enclosures to connect them to the Pi. We settled on two 4TB Seagate IronWolf NAS drives running in RAID 1. Additionally, we got a passively cooled aluminum case for the Pi. I absolutely love this thing: it makes the Pi look like a mini industrial computer and it cools it pretty well. The drive enclosures, on the other hand, work OK, but they need to be turned on manually when booting, which was a bit of an oversight.
The setup works as shown above. The Pi is connected to the network via Ethernet and to the drives via USB. The large storage drives have their own individual power supplies, and the Pi is powered by a basic USB charger.
If you want to build a similar system, check out the guide I wrote with lots of terminal commands to copy and paste. :^)
🐧Installing Ubuntu and Booting
In the past, I had some bad experiences running Pis from an SD card over extended periods of time. They just don’t handle the constant writing of log files, OS housekeeping, and updates very well, and die randomly. That is why I wanted to run the OS from a USB-attached SSD.
I also wanted to use the Pi's capabilities to the fullest and install a 64-bit OS. As I am only a casual Linux pleb, I settled on Ubuntu Server, downloaded the image file and flashed it to the SSD. The Pi’s USB booting support was already enabled but turned out to be more than flaky. Booting from my thumb drive worked, but it really disliked my SSD. With multiple USB devices connected, it stopped working altogether. When inserting an SD card containing only the boot partition with the boot loader and Linux kernel, it worked fine again.
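If you want to follow along, flashing boils down to writing the image straight to the SSD. The image name and target device below are placeholders; check the actual device with lsblk first:

```shell
# Write the Ubuntu Server image for the Pi to the SSD.
# /dev/sdX is a placeholder -- verify it with lsblk, dd overwrites the target!
xzcat ubuntu-server-arm64+raspi.img.xz | sudo dd of=/dev/sdX bs=4M status=progress conv=fsync
```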
The interesting part is that the SD card doesn’t get mounted on boot, making me question whether the Pi actually boots from the SD card, or the firmware just runs unstably without one present. When interrupting the boot process and entering U-Boot, it detects the thumb drive, but I couldn’t tell whether the Pi loaded U-Boot from the thumb drive or the SD card. Trying the same thing with the SSD produced even weirder results. Without the SD card the system refused to boot into Linux, and U-Boot claimed that no USB storage devices were present, although it correctly listed the device. With the SD card inserted, booting worked like a charm, but U-Boot still thought there were no USB devices to be found. My theory is that U-Boot gets loaded from the SD card and boots Linux from it, but Linux then mounts the SSD’s partitions as boot and root filesystems, ignoring the SD card. Nothing a few changes in /etc/fstab couldn’t fix.
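The relevant /etc/fstab entries ended up looking roughly like this; the PARTUUIDs are placeholders, the real ones come from blkid:

```
# /etc/fstab -- boot and root both point at the SSD's partitions,
# so the SD card is only touched by the firmware/U-Boot stage
PARTUUID=xxxxxxxx-01  /boot/firmware  vfat  defaults  0  1
PARTUUID=xxxxxxxx-02  /               ext4  defaults  0  1
```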
🔌Connecting the HDDs
After installing the raspi tools for Ubuntu and the raspi-config script, I plugged in the HDDs. Running lsusb -t revealed that all drives were only running in "usb-storage" mode, limiting them to thumb-drive speeds. This sent me down the next wild goose chase. I wasn’t sure whether the controllers in the HDD enclosures were compatible with the faster UASP mode, which allows the OS to communicate with the drive controllers directly instead. But I definitely knew that my SSD was, as I had connected it to an Ubuntu desktop machine, where it was correctly detected and mounted with UASP. After lots of unfruitful googling and reading dmesg output, I finally found a hint. Turns out the USB driver shipped with the ARM Ubuntu version for the Pi is not capable of "scatter_gather", which seems to be essential for UASP.
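To check which mode a drive ended up in yourself, the driver name in the USB device tree is the giveaway:

```shell
# Driver=uas means UASP is active; Driver=usb-storage is the slower
# bulk-only fallback
lsusb -t

# kernel messages hint at why UAS was skipped for a device
sudo dmesg | grep -iE 'uas|scatter'
```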
🔗RAID and Caching
Setting up the RAID with mdadm worked flawlessly. For caching I used bcache, which uses a specified partition as a block-level cache and creates a virtual block device, just like mdadm. I split the SSD into two nearly equal partitions of about 100GB each: one for booting, one as the cache. After a bit of looking around the web, I found a Python script on GitHub that prints the bcache stats in a human-readable format.
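For reference, the whole stack can be sketched in a handful of commands; the device names are assumptions, adjust them to your drives:

```shell
# RAID 1 across the two 4TB IronWolfs (device names are assumptions)
sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda /dev/sdb

# bcache: SSD partition as the cache set (-C), the array as backing
# device (-B); creating both in one call attaches them to each other
sudo make-bcache -C /dev/sdc2 -B /dev/md0

# the filesystem goes on the resulting virtual device
sudo mkfs.ext4 /dev/bcache0
```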
Just in case someone ever broke into our house and decided to take the HDDs with them, I wanted to make sure that at least the data on them is safe. I am not a big fan of device-level encryption after seeing multiple soft-locked machines that were basically paperweights after a failed update or system blue screen. Therefore, I opted for ecryptfs, which does directory-level encryption. When enabled, it maps a directory containing encrypted files to another mountpoint, which serves as a portal where all files can be accessed decrypted. The whole experience is pretty seamless, as long as you use it exactly the way it is intended: encrypting a folder called ".Private" and mapping it to one called "Private" in a user’s home directory. Just guess what I tried to do without knowing better...
I found a tutorial outlining how to set up ecryptfs in a different directory, but after nearly an hour of failure, I decided to read the source code of the setup tool. Well, things are just hardcoded there, like the user’s home directory as the base folder, for example. My solution was to create a new user whose home directory is mounted on the cached and RAIDed hard drives.
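A sketch of that workaround, assuming the array is mounted at /mnt/storage; the path and user name are made up for illustration:

```shell
# new user whose home lives on the bcache/RAID mount
sudo useradd --create-home --home-dir /mnt/storage/nas nas
sudo passwd nas

# encrypt that home with the standard ecryptfs tooling, which only
# supports exactly this home-directory layout
sudo apt install ecryptfs-utils
sudo ecryptfs-migrate-home -u nas
```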
After every boot, this user needs to log in once to enable decryption of the files in its home directory.
📂Directory Sharing with Samba
Configuring Samba was pretty quick, and after setting a user password, I could immediately navigate to the folder with the Windows File Explorer and add it as a network drive. Up- and downloading as well as folder creation also worked like a charm. Only my dad’s iPad seems to have problems uploading files, while it can still navigate and create folders. But it looks like this is a bug in the current version of iPadOS, as people have complained about the same problem in the Apple forums very recently.
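The share definition itself is only a few lines in /etc/samba/smb.conf; the share name, path and user below are assumptions matching my setup:

```
[nas]
   path = /mnt/storage/nas
   valid users = nas
   read only = no
   browseable = yes
```

Note that the Samba password is set separately from the Unix one, with smbpasswd -a.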
When I started transferring some of my files to the network share, I noticed that large files like videos or VM disks cause problems. It looks like Samba just refuses to copy them. Whenever a file larger than 1.5GB is encountered, CPU utilization on one of the Pi's cores jumps to 100% until Windows Explorer kills the copy process.
At first, I wasn’t sure whether ecryptfs was the reason for this behavior, as it has to encrypt such a massive file, but the same transfer over SFTP worked without hiccups. Sadly, I am still unsure how to fix this.
As a restart requires the user whose home directory is encrypted to log in before its contents are available on the network, I wanted a simple web login for that. I am the only one at my place who is comfortable with connecting to the Pi via SSH, but a web login would be a good alternative. Plus, I like the idea of a simple web dashboard. After a few Google searches, Cockpit came up as a web-based monitoring GUI with user login. Decrypting and mounting the user directory by logging in to Cockpit works out of the box, but the monitoring UIs throw lots of errors at me. This part is still very much a work in progress, as I haven’t had the motivation to look into these error messages yet.
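Cockpit itself is a one-line install from the Ubuntu repositories and listens on port 9090:

```shell
# install and enable the Cockpit web UI (reachable at https://<pi-ip>:9090)
sudo apt install cockpit
sudo systemctl enable --now cockpit.socket
```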