Getting a Synology and My Past NAS Experience

Synology DS920+

In July 2022, I gave myself a belated birthday present of sorts by pulling the trigger on a longtime wishlist item: network-attached storage, which I'd been coveting for years. While my wallet is now much lighter, I believe it's worth the expense to have my own file server with multiple applications. This post chronicles my first few months with the Synology DS920+ and what I've learned so far with it, as well as my past history with data storage. Perhaps it can help you know what to expect in case you wish to get your own NAS.

Backing up your data becomes a major consideration once you've suffered data loss, whether through natural disaster, house fire, theft, or just your hard drive giving up the ghost. Regular users may not think of it as important at first, especially now that cloud storage services offer convenient data syncing and other features. But if you're dealing with a lot more data than just the occasional batch of vacation photos, then you'll have to either pay annual subscription fees or buy a bunch of hard drives.

Reasons for getting network-attached storage include data redundancy, the ability to access data from multiple devices on the same network without hassle, and being able to host the data yourself without paying subscription fees. While cloud storage is convenient, it can get pretty expensive and even dicey if the service ever gets hacked (which seems to happen with alarming regularity these days). A NAS lets you take the matter into your own hands.

My History with Network-Attached Storage

My first experience with network-attached storage was a Seagate Central with 3TB of storage, which I bought for ₱9,000 from PCWORX Gilmore a decade ago, back when that amount of money was a very significant expense for me. Seriously, that was a lot of damage.

That’s why when it finally gave up the ghost after around five years of being constantly on, I was gutted. I then learned how to retrieve my files from its proprietary file system and reformatted it to serve as a download drive while tossing the rest of the unit in the trash, only for the drive itself to also kick the bucket over a year later.

The Seagate Central had DLNA for media sharing and other conveniences that made it talk to my other devices, so I was quite happy with it while it was still alive. However, it turned out that it was well-known for being a crappy storage solution that might as well have had an expiry date stamped on it.

Regardless, it introduced me to the convenience of network-attached storage. I dreamed of having a better NAS with redundancy, and I planned to build my own.

Why I Opted Out of Building My Own NAS

For years, I looked up DIY NAS builds and researched FreeNAS (now TrueNAS CORE), Unraid, and other NAS platforms. All that tinkering seemed exciting at first. However, I also feared that I’d mess everything up and end up losing my files.

I already had another important drive die on me, and I’m still looking around for a data recovery service that can restore my files. I wasn’t getting any younger, so the tinkering wasn’t as appealing since I have other stuff going on in my life now.

That’s why upon finally having the money to get a proper NAS with RAID and substantial capacity, I opted for the Synology DS920+. Not only is it a ready-to-use NAS, it also has the Synology ecosystem that has let me do things I didn’t know were possible with a NAS.

Getting a Proper Synology NAS

The Synology DS920+ 4-bay NAS goes for around ₱29,000. I filled it with four 8TB Seagate IronWolf drives for around ₱13,000 each. I then upgraded it with a 4GB DDR4 SODIMM 2666MHz stick of RAM for under ₱900 to double its memory, as well as two sticks of 500GB NVMe m.2 SSDs for under ₱5,000 to serve as SSD cache.

It all totalled just a bit under ₱87,000 — a very substantial expense, around the same price point as a whole new computer. It practically is a computer since it can run applications and Docker containers as well.
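For the curious, here's the arithmetic behind that total, using the rounded prices quoted above. I'm treating the ₱5,000 for the SSD cache as the price of the pair, which is what makes the numbers line up:

```python
# Rough tally of the DS920+ build cost in Philippine pesos (₱),
# using the rounded prices quoted in the post.
nas = 29_000                 # Synology DS920+
drives = 4 * 13_000          # four 8TB Seagate IronWolf HDDs
ram = 900                    # 4GB DDR4 SODIMM 2666MHz upgrade
ssd_cache = 5_000            # two 500GB NVMe m.2 SSDs (assumed price for the pair)

total = nas + drives + ram + ssd_cache
print(f"Total: ₱{total:,}")  # → Total: ₱86,900
```

That lands just under ₱87,000, as stated.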

I bought my hard drives one by one since I didn't want to blow my wad buying all four at once, only to receive defective units. Knowing my luck, that was likely. However, the problem with this approach is that once you've built the RAID array, each additional hard drive has to be resynced into the array. Seriously, you have to wait for days or even a whole week until it's done.
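That multi-day wait checks out with back-of-the-envelope math. The throughput figures below are assumptions for illustration (actual resync speed varies with drive, system load, and the speed limit setting), not measurements from the DS920+:

```python
# Rough estimate of how long resyncing one 8TB drive takes
# at a few assumed average rebuild throughputs.
drive_bytes = 8 * 10**12  # 8 TB, decimal, as drives are marketed

for mb_per_s in (30, 60, 120):  # assumed average resync throughput
    hours = drive_bytes / (mb_per_s * 10**6) / 3600
    print(f"{mb_per_s} MB/s → {hours:.1f} hours")
```

At a throttled 30 MB/s, that's roughly three days per drive, and real-world resyncs that also verify parity across the whole array can stretch even longer.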

Synology RAID Resync Speed Limit Settings

You can speed it up by adjusting the RAID resync speed limit in Storage Manager from the recommended setting of "Lower the impact on overall system performance" to "Run RAID resync faster". Take note that if you do so, the NAS itself will run slower, so you won't be able to move files to it or do much else. Maybe you can still stream media from it in the meantime, but that's about it.

Perhaps that’s the drawback of having a NAS that runs on a Celeron processor (even if it’s a quad core). But the point is that you’re exchanging performance for power efficiency, which is more important when it comes to a file server that will likely run 24/7. The last thing you’d want is to build a DIY NAS running on a full desktop processor and realize too late that it may be a bit too power hungry when you receive your next electricity bill.

That's why it's recommended to have all the hard drives prepared right then and there, so you don't have to go through the trouble of waiting much longer. Besides, you can always start with smaller hard drives, then upgrade to larger ones later on. Synology's DSM lets you replace drives with higher-capacity ones without losing data, at least on SHR and RAID 5/6 storage pools. Therefore, once one of my current drives starts throwing errors, I have the option of replacing it with a bigger drive. Hopefully, I'll be able to afford twice the capacity by then.

Synology also has expansion units that connect to the NAS through the eSATA port in the back. That's a likely avenue for me in the future if I ever need more storage and don't want to buy a whole new NAS to juggle alongside the current one. I like having one platform that manages everything for simplicity and homogeneity, so an expansion unit is something I may obtain down the line, or perhaps a NAS designed for SSDs.

Considering Other Data Archival Options

Before getting a NAS, regular users have the following options for archiving data (as far as I know):

  1. Cloud storage
  2. More hard drives
  3. DAS (Direct-attached Storage)
  4. Non-RAID NAS
  5. Burning discs

Each has its own advantages and disadvantages. What they all have in common is that they're more affordable than a full NAS with data redundancy, but they trade away capacity and data security in exchange for that lower price.

I didn't include stuff like tape storage because I'm talking about regular users; most people have neither the time nor the patience to archive their family vacation photos and videos on tape drives. I also didn't include stuff like Time Machine because I don't know much about it.

The common theme among them is expense. The reason you haven't considered a file server at this point is that you don't have an extra $2,000 or so to spare on a NAS with RAID and the hard drives to fill it up.

Cloud Storage

The most obvious method these days is cloud storage. Google Drive, Microsoft OneDrive, Dropbox, MEGA, and so on are readily available for backing up and sharing files. There are also services that specialize in online backup for home and business use, such as Carbonite, BackBlaze, IDrive, CrashPlan, and so on.

They have different free tiers and subscription models, and it's possible to have an account with each one so you can cobble together a good amount of cloud storage without paying a cent. Of course, the biggest disadvantage is that you won't get enough storage by staying free.

Perhaps the most accessible is a Google One subscription that expands your storage to 100GB and offers other features that may be useful if you’re a regular user of Google’s online suites. That would be sufficient for most people and the ₱89 ($1.99) monthly charge isn’t too heavy.

Mind you, 100GB is not enough if you’re archiving video files or large photo albums. At best, you may be able to hold up to five projects worth of data before you have to delete old stuff to make room for newer stuff on a regular basis. That 100GB is best for backing up stuff you’ll need later on and/or sharing files to other people, both of which I do with my Google Drive.

You also need to have enough reliable bandwidth to be able to upload your files to the cloud drive. If you have to upload on a daily basis, and yet you’re stuck with a sub-100Mbps connection, then you’re going to have a bad time, especially if you have to do other things with that connection like streaming and content creation.
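The bandwidth math is worth spelling out, since connection speeds are quoted in megabits while files are measured in gigabytes. A quick sketch (the 100GB and 20 Mbps figures are just illustrative numbers, not anyone's actual plan):

```python
# How long a cloud upload takes at a given uplink speed.
# Note the unit trap: Mbps is megaBITS per second, so divide by 8
# to get megabytes per second.
def upload_hours(gigabytes: float, mbps: float) -> float:
    bits = gigabytes * 8 * 10**9        # decimal GB to bits
    return bits / (mbps * 10**6) / 3600  # seconds to hours

print(f"{upload_hours(100, 20):.1f} hours")  # 100GB at 20 Mbps up → 11.1 hours
```

And that's assuming the connection does nothing else the whole time; real uploads sharing the line with streaming or video calls take longer.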

Then again, if you don’t happen to have a good connection in the first place, then you likely don’t have enough money to spare for paid cloud storage.

More Hard Drives

This is likely the next most common solution these days, especially for occupations that handle gigabytes of data every day, like photographers, filmmakers, content creators, and so on. Some of them have trays and even filing cabinets full of hard drives.

You can get a 4TB external drive for around ₱5,000 these days, so you can conceivably buy one every couple of months whenever you need extra space for your crucial files. Then again, for many people, buying hard drives is a low-priority expense next to bills and other necessities. Another major concern with hard drives is their failure rate, which is why we're having this discussion about data backups in the first place.

On one hand, all hard drives are bound to fail one day, and it's only a question of when, depending on how hard and how often they have to read and write and how much vibration they endure during use. On the other hand, I own a 500GB Buffalo shockproof drive that was bought for me way back in late 2008 and is still alive to this day.

DAS (Direct-attached Storage)

Imagine a NAS like the Synology, but connected over USB only, with no ethernet. That's a DAS, or direct-attached storage, which is the next best thing to a NAS.

If you have a bunch of hard drives, but don’t have the cash and/or know-how to buy or build your own NAS, then you could go for something like an Orico 5-bay hard drive enclosure or a Mediasonic Probox enclosure (if you can still find one of those) and put your hard drives in that.

You can then plug that into your PC via USB (hopefully 3.0 at least) and have a JBOD just like that. Or if you want to have RAID, you can use Windows Storage Spaces to create a RAID array that will add some form of redundancy to your DAS, thus further securing your data.

If you have a thin client or mini-PC lying around, you can use that with Windows Storage Spaces to make your own RAID array and connect it to your network for a makeshift NAS. Just note that since the enclosure is connected via USB, you effectively have a single point of failure: if that USB connection gives up the ghost, you can lose access to the entire thing.

I was thinking of getting this before I opted for the Synology instead. I erred on the side of caution and spent a whole lot more instead of going for this cheaper option because I didn’t want to experience losing my data due to stupidity once again.


Non-RAID NAS

These are essentially external drives with ethernet or wi-fi connectivity, making them accessible to the local network on their own. My old Seagate Central was one such product, and it worked pretty well for a good while. The problem with such devices is that you'll likely keep them on 24/7, which makes them more likely to fail within a few years.

That’s exactly what happened to me with the Seagate Central. It was especially painful for me when mine finally failed since ₱9,000 was a big chunk of money for me at that time. But without that experience, I would’ve never understood how good it is to have a NAS with features like DLNA and mobile apps that let you use it with all of your devices.

I get to access my files while pooping. The first time I got to do that, it was like my third eye had been opened.

There are cheap devices now that give regular users network-attached storage in their own homes. Some routers even let you plug an external drive into them for remote access, but those tend to be either too clunky or too slow. I wouldn't recommend them; while they provide the same convenience, they don't provide the same quality.

Burning Discs

You can also do things the old-fashioned way and burn your files onto discs. However, even if you have a Blu-ray writer (like the one I recently purchased), the most a BD-R disc can hold is 25GB, or double that for a dual-layer disc. That may seem like a lot at first, until you realize it'd take 40 single-layer discs or 20 dual-layer discs to hit one terabyte. Burning that much data would also be a day-long task.
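The disc counts above fall straight out of the capacities:

```python
import math

# How many BD-R discs it takes to archive one terabyte of data.
terabyte_gb = 1000        # 1TB in decimal gigabytes
single_layer_gb = 25      # BD-R capacity
dual_layer_gb = 50        # BD-R DL capacity

print(math.ceil(terabyte_gb / single_layer_gb))  # → 40 discs
print(math.ceil(terabyte_gb / dual_layer_gb))    # → 20 discs
```

And that's per terabyte; a 32TB array's worth of data would need over a thousand single-layer discs.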

We now have m.2 drives the size of a stick of gum that can have a whole terabyte of data copied into them in no time thanks to NVMe. You can even buy enclosures that can turn m.2 SSDs into USB flash drives, which I’ve also done recently.

Meanwhile, a DVD-R holds just 4.7 gigabytes, or double that for a dual-layer disc, which is paltry by today's standards. It might as well be like floppy diskettes in the early 2000s, when USB flash drives started to become a thing. I remember having to use 3.5-inch floppy disks in high school because USB drives only caught on here in the Philippines when I was already in college. Anyway, there's a good reason why CD-R King is no longer a booming business, and the branches that remain don't even sell CD-Rs anymore.

At best, burning discs can still be a good way to archive crucial but seldom-accessed data. I sure as hell wish I'd thought of doing that with the stuff I've since lost when my first 4TB hard drive gave up the ghost. I still have the drive and am looking for a data recovery service that can restore those files; apparently, its read and write heads are totally broken, and the first data recovery service I brought it to couldn't do anything about it.

Optical media is certainly not a primary backup method, but it's still useful for archiving media that doesn't need to be accessed all the time. If I'd been more diligent in burning my video archive to discs, I wouldn't be beating myself up right now for not having a second backup and potentially spending tens of thousands of pesos to have my files recovered.

Remember the 3-2-1 Backup Rule (If You Can Afford It)

The 3-2-1 backup rule is as follows:

Have at least:

  • 3 copies of your data (the original plus at least 2 backups)
  • in at least 2 different storage media
  • and have at least 1 backup offsite (in a different physical location)

While a sensible rule of thumb, it does have a fairly high cost. Imagine having to buy hard drives, solid state drives, flash drives, cloud storage subscriptions, and so on. Most people don’t have an extra chunk of change to buy storage in multiple formats to back up their files, which is why horror stories of lost data are commonplace.

There’s also the issue of having an offsite backup. If you have a safety deposit box in a bank or a storage unit somewhere, that’s your go-to. You can also keep it in your workplace, although that has its own risks. You can also have relatives keep your backups for you, and that also has its own risks. In my case, for instance, the relative route is out of the question due to the nature of our familial relationships (or lack thereof).

You should keep the 3-2-1 backup rule in mind, but you don't have to follow it to the letter. As long as you have multiple backups of critical data, you should be good for the most part, unless something truly catastrophic happens. The rule was designed to cover all possibilities and eventualities; after all, it's better to be meticulous than to lose everything in a house fire. You can't predict the future, but you can prepare for it.

That also means having a NAS is not enough, even with data redundancy. If all of your critical data is in one place, there's always the possibility of losing it in one go, no matter how safe it seems. That's why optical and removable media remain relevant for archival purposes; there should always be room for more ways to back up data.

Got Feedback?

Have something to say? Do you agree or am I off-base? Did I miss a crucial detail or get something wrong? Please leave whatever reactions, questions, or suggestions you may have in the comment section below.

You may also like/follow and leave a message on either Facebook or Twitter. Please subscribe to both the YouTube channel and my personal YouTube channel, as well as my Twitch channel, for more content. Thank you for dropping by.