
I have actually made a Raspberry Pi based NAS and found it was a pain.

The SATA controller isn't terrible, but it and other parts of the hardware have had many strange behaviors over the years, to the point where I've had to compile the kernel and fiddle with settings just to get a device to do what it's supposed to.

Even if you're using a power supply that is well supported, eventually you seem to hit internal limits and run into problems. That's when you see people underclocking the chip to shift some of that phantom power budget to other components. Likewise you have to power almost everything from a separate source, which pushes me even closer to a "regular PC" anyway.

I just grab an old PC off Facebook for under $100. The current one is a leftover from the DDR3 + Nvidia 1060 gaming era. It's a quad core with HT, so I get 8 threads. Granted, using most of those threads pushes the system to 90% usage even when running jobs with only 2 threads, probably because the real hardware being used there is something like AVX and it can't be shared between all of the cores at the same time.

The SATA controller has been a bit flaky, but you can pick up 4-port SATA cards for about $10 each.

When my Raspberry Pi fails I need to start looking at configurations and hacks to get the firmware/software stack to work.

When my $100 random PC fails I look at the logs to find out what hardware component failed and replace it.



> The SATA controller has been a bit flaky, but you can pick up 4-port SATA cards for about $10 each.

If your build allows, the extra money for an LSI or a real RAID controller is well worth it. The no-name PCIe SATA cards are flaky and very slow. Putting an LSI in my NAS was a literal 10x performance boost, particularly with ZFS, which tends to have all of the drives active at once.
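
Rough math on why that is, assuming the cheap card hangs all four ports off a single PCIe 2.0 x1 lane (typical for the $10 ones, and some add a SATA port multiplier on top) and ~150 MB/s sequential per spinning drive; the exact numbers are guesses:

    # Back-of-the-envelope: aggregate drive throughput vs. the card's PCIe uplink.
    # All numbers below are rough assumptions -- check your actual card and drives.
    drives = 4
    seq_per_drive_mb_s = 150      # typical 3.5" spinning disk, sequential
    cheap_x1_mb_s = 500           # usable bandwidth of one PCIe 2.0 lane, roughly
    lsi_x8_mb_s = 4000            # PCIe 2.0 x8 uplink on an older LSI HBA, roughly

    demand = drives * seq_per_drive_mb_s
    print(f"demand during a ZFS scrub/resilver: ~{demand} MB/s")
    print(f"cheap x1 card ceiling: ~{cheap_x1_mb_s} MB/s (shared, so it bottlenecks)")
    print(f"LSI x8 HBA ceiling: ~{lsi_x8_mb_s} MB/s (the drives become the limit)")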


> If your build allows, the extra money for an LSI or a real RAID controller is well worth it.

Keep in mind that if you get a real RAID controller and want to use ZFS, you probably want to make sure it can present the disks to the OS as JBOD. Sometimes this requires flashing a non-RAID ("IT mode") firmware.
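
One quick way to sanity-check the JBOD part (just a sketch, not specific to any controller): with JBOD/IT-mode firmware the kernel sees each physical drive as its own block device with its real model string, while RAID firmware typically exposes one big virtual volume. Something like this shows what the OS actually sees:

    # List block devices with size and model string, straight from sysfs.
    # With JBOD/IT-mode firmware every physical drive should appear individually;
    # a hardware RAID volume usually shows up as a single "virtual disk" device.
    import pathlib

    for dev in sorted(pathlib.Path("/sys/block").iterdir()):
        size_sectors = int((dev / "size").read_text())        # 512-byte sectors
        model_file = dev / "device" / "model"
        model = model_file.read_text().strip() if model_file.exists() else "?"
        print(f"{dev.name:10} {size_sectors * 512 / 1e12:6.2f} TB  {model}")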

That said, slightly older LSI cards are quite cheap on eBay, and the two I've bought have worked perfectly for many years.


I am curious about the slow part. I use these crappy SATA cards, and I'm sure they are crappy, but the drives are only going to give 100 MB/s in bursts, and they have an LVM cache (or the ZFS equivalent) in front of them to soak up short-term writes.

I'd get it if I were wiring up NVMe drives that are going to push 500 MB/s and higher all the time.

What I really care about with SATA, and what I mean by flaky, is when I have to physically reboot a system every day because the controller stays in a bad state even through a soft `reboot`, and Linux fills up with IO timeouts because the controller seems to stop working after some amount of uptime.
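
A crude way to catch that state before the IO timeouts pile up (just a sketch; the path is hypothetical) is a watchdog that does a small read against the array and alerts when it doesn't finish, since a wedged controller tends to make reads hang rather than return errors:

    # Crude storage watchdog sketch: probe the array with a small read and alert
    # if the read doesn't complete in time. /mnt/tank/.watchdog is a made-up path;
    # point it at an existing file on the pool. Note the page cache can satisfy
    # repeated reads without touching the disk, so vary the file/offset if needed.
    import sys, threading, time

    PROBE_PATH = "/mnt/tank/.watchdog"
    TIMEOUT_S = 15
    INTERVAL_S = 60

    def probe():
        with open(PROBE_PATH, "rb") as f:
            f.read(4096)

    while True:
        t = threading.Thread(target=probe, daemon=True)
        t.start()
        t.join(TIMEOUT_S)
        if t.is_alive():
            # Probe is stuck in the read -- the controller has likely wedged.
            print("storage probe hung; controller may need a power cycle", file=sys.stderr)
        time.sleep(INTERVAL_S)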


> probably because the real hardware being used there is something like AVX and it can't be shared between all of the cores at the same time.

That's not the right explanation; each physical core has its own vector ALUs for handling SSE and AVX instructions. The chip's power budget is shared between cores, but not the physical transistors doing the vector operations.


Thank you for the correction.


I don't know about TrueNAS, but with Proxmox the two random $10 SATA cards I tried only gave me issues. With the first one the OS wouldn't boot; the second seemed to work fine, but the connected drives disappeared as soon as I wrote to them.

Used server-grade LSI cards seem to be the way to go. Too bad they're power hungry based on what I've read.


I have had a random $10 SATA card that has worked fine over the last 5 years.


I like the power efficiency of the Raspberry Pi. Prior to it, I used a MacBook that sipped power, something like 11 W.


I do too. For many use cases it's awesome to use an ESP32, Raspberry Pi, or Arduino when I want to stash some little widget that can sip from a small battery over the next week. It's equally awesome that in many scenarios you can be net positive on consumption with a super simple solar panel feeding a few watts here and there into a battery.

But at home things are different. While I want to use as little power as possible, the realistic plan for being sustainable at home is to use solar with batteries. That's a plan I actually think can matter, and one I can participate in at a relatively low cost ($10k).

Messing around with a system to save a few watts isn't very valuable to me in this context.


It depends how hard it is. I installed OpenMediaVault on my Pi and it's worked great ever since. I don't think any other option would have resulted in less work for me.

As far as solar goes, where I live it's not worth getting, so I can't comment there. A Raspberry Pi costs about $5 a year to run; a 60 W computer costs about $79 a year. I agree that if it's very difficult to set up a Pi, it might be worth the extra cost of the computer, but the energy savings are pretty incredible.
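
For reference, those figures work out if you assume roughly $0.15/kWh and a Pi idling around 4 W; plug in your own rate:

    # Yearly electricity cost of an always-on machine: watts -> kWh/year -> dollars.
    RATE_PER_KWH = 0.15        # assumed price in $/kWh -- adjust for your utility
    HOURS_PER_YEAR = 24 * 365

    for name, watts in [("Raspberry Pi (~4 W)", 4), ("60 W desktop", 60)]:
        kwh = watts * HOURS_PER_YEAR / 1000
        print(f"{name}: {kwh:.0f} kWh/year -> ${kwh * RATE_PER_KWH:.0f}/year")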


What's the power usage like?


Yeah this is what keeps me from considering old PCs for NAS.

Maybe I'm stating the blindingly obvious, but it seems like there is a gap in the market for a board or full kit with a high-efficiency ~1-10 W CPU and a bunch of SATA and PCIe ports.


Then you've got to consider what you're optimizing for. Is the power bill going to be offset by the cost of a Pi plus any extras you need, or by a cheap second-hand PC someone wants to clear out, or by putting an old but serviceable PC you already have back into use for free? Is it heat? Noise? The space it needs to hide away in? Airflow for the location?


https://www.minisforum.com/pages/n5_pro

Minisforum is sort of working on it. I'd imagine the AMD "AI" processors are pretty low power at idle since they're mobile chips. It obviously has the downsides of other mini PCs, though (high cost, low expandability).


That one's quite expensive, but there are others from Ugreen and Aoostar with less powerful CPUs.

I think they generally idle at 5-10W, like modern laptop parts do in general.


I've been eyeing off the Radxa ROCK 5 ITX with the Rockchip RK3588. There are two variants, one gives you 4x SATA, the other gives you 1x PCIe.


There’s also the Orion O6 if you need more IO/perf - https://radxa.com/products/orion/o6#techspec


I'm using a retired 4790K build for mine; it idles at 60 W and I really need to do something about that.


Probably not very good. I selected large spinning hard drives because I could get them at a good price at 2 TB each, I wanted to set up a RAID5-like system in ZFS and btrfs (lesson learned: btrfs doesn't actually support this correctly), and I wanted to get at least 10 TB with redundancy.
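
Rough capacity math for that target (a sketch; it ignores ZFS overhead and the TB-vs-TiB difference):

    # How many 2 TB drives to reach >= 10 TB usable with single-parity (raidz1-style)?
    DRIVE_TB = 2
    TARGET_TB = 10
    PARITY_DRIVES = 1          # raidz1; bump to 2 for raidz2

    n = PARITY_DRIVES + 1
    while (n - PARITY_DRIVES) * DRIVE_TB < TARGET_TB:
        n += 1
    print(f"{n} x {DRIVE_TB} TB drives -> {(n - PARITY_DRIVES) * DRIVE_TB} TB usable "
          f"with {PARITY_DRIVES} drive of parity")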

I don't know how much power each of those SATA disks draws, but probably more than an entire Raspberry Pi does.

Likewise it has a few case fans in it that may be pointless. I would rather it never have a heat issue than save a few "whirr" sounds off in a closet somewhere that nobody cares about.

It's also powering that Nvidia 1060 that I do almost nothing with on the NAS. I don't even bother to enable Jellyfin's GPU transcoding because I prefer to keep my library encoded in H.264 for now; I haven't made the leap to a newer codec because the different smart TVs have varying support, and sometimes my daughter has a friend come over with a weird Amazon tablet thing that only does a subset of things correctly.

The 1060 isn't an amazing card really, but it could do some basic Ollama inference if I wanted. I think it has 6GB of memory, which is pretty low, but usable for some small LLMs.
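
Rough VRAM math for that (assumptions: ~0.5 bytes per parameter for 4-bit quantized weights plus ~1.5 GB of overhead for KV cache and the CUDA context; actual usage varies with runtime and context length):

    # Which quantized model sizes plausibly fit on a 6 GB card?
    # Assumes 4-bit weights (~0.5 bytes/param) plus ~1.5 GB overhead -- rough guesses.
    VRAM_GB = 6.0
    OVERHEAD_GB = 1.5
    BYTES_PER_PARAM = 0.5

    for billions in (3, 7, 8, 13):
        weights_gb = billions * BYTES_PER_PARAM     # billions of params * bytes/param = GB
        fits = weights_gb + OVERHEAD_GB <= VRAM_GB
        print(f"{billions:>2}B params: ~{weights_gb:.1f} GB weights -> "
              f"{'fits' if fits else 'too big'} on {VRAM_GB:.0f} GB")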



