Tiernan's Comms Closet

Geek, Programmer, Photographer, Network Engineer…


Some Random Links for Prime Day 2023

Well, it’s Prime Day 2023, so I have been busy ordering some stuff, and, well, given everyone and their mother is doing posts on Prime Day stuff, I thought I would add my list of interesting things, including some of the things I bought. PS: all links are affiliate links and were found in the UK store, but they use GeniusLink to redirect you to the best store for you… Some items in the UK store might not be available in the US or other stores…

So, first, the things I bought:

Now, for some things I don’t need (or can’t afford…) but that caught my eye while browsing.

Of the stuff I have ordered so far, the Ryzen 5 machine should arrive tomorrow, so I hope to do some sort of unboxing on my YouTube channel. Maybe head over and subscribe while you’re waiting!

Day 53 of #100daysofhomelab

Day 53 of #100daysofhomelab and it’s been a busy week… ish… I’ve been battling vertigo on and off this week, so haven’t done a lot. I did, however, fix some issues with the network, set up a proper failover WAN connection using SmoothWAN and my Quad 2.5Gb Box, and have started making major changes to GodBoxV3.

Originally, GodBoxV3 had all the spinning disks (8 8TB drives, shucked from WD My Book 8TBs or 8TB Seagate IronWolfs) in a single RAID 5 pool in Windows Storage Spaces. The NVMe drives were a second pool (5 of them: 4 Force MP510 480GB NVMe SSDs on a Hyper M.2 x16 card, plus a 5th unbranded one on a 1x PCIe add-in card), and 2 960GB IronWolf SSDs made up a third pool.

I deleted the RAID 5 and NVMe arrays, and now, for testing, I have spun up a TrueNAS Core VM on the 2 SSDs and passed the NVMe and HDD drives into that VM. Windows can still “see” them, but they are marked as offline; Hard Disk Sentinel and CrystalDiskInfo can both see their SMART status (TrueNAS can’t, weirdly…). I have 7 of the 8 spinning drives added to a single pool (one is failing, so I left it out; this is all for testing currently, anyway) and the 5 NVMes added to a second pool. The plan is to use the 4 Force MP510s (or replacement drives) as a single pool, then the other NVMe (or maybe even 2) as a cache or log device for the spinning-disk pool.
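For the record, here’s a rough sketch of that target layout from a TrueNAS Core shell. The device names (da0…, nvd0…) and the RAIDZ level are my assumptions; in practice you would build this through the TrueNAS UI.

```bash
# Rough sketch only — device names are placeholders, check yours first
# with `camcontrol devlist` / `nvmecontrol devlist` or the TrueNAS UI.

# 7 spinning disks in a single RAIDZ vdev (RAIDZ2 here; RAIDZ1 would
# give more space, less safety):
zpool create tank raidz2 da0 da1 da2 da3 da4 da5 da6

# 4 matching NVMe drives as a stripe of mirrors:
zpool create fast mirror nvd0 nvd1 mirror nvd2 nvd3

# The odd NVMe as a read cache OR a log device for the HDD pool:
zpool add tank cache nvd4   # L2ARC
# zpool add tank log nvd4   # ...or SLOG; one role per device
```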

So far, it doesn’t seem to matter if I am using the NVMe or HDD pool; speeds from my Mac (with a 10Gb Thunderbolt adapter) are around the same on both… Might be a config issue, might be the odd NVMe drive slowing the rest down… but I am happy with the speeds so far… I have seen 300-400MB/s writes and 900+MB/s reads on both NVMe and HDD… Most of that is probably cached: the VM has 64GB RAM given to it, and the test file was only 5GB (BlackMagic Disk Speed Test). More testing is required, though.
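One quick way to sanity-check how much of that is RAM rather than disk is to watch the physical disk I/O while the benchmark runs (“tank” below is a placeholder pool name):

```bash
# Per-vdev bandwidth, refreshed every second. If the client is seeing
# 900MB/s reads but the disks here are nearly idle, those reads are
# being served from the ARC (RAM), not the pool.
zpool iostat -v tank 1
```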

Day 50 of #100daysofhomelab

Day 50 of #100daysofhomelab (this was stuck in a drafts folder, so it is a couple of weeks old… I decided to recycle it as day 50, but it was originally day 37 or something…).

Just about 13 days ago: after running ZFS on my Mac for a few hours, I removed it and installed a trial of the SoftRAID software… I am not sure what was going on, but with ZFS installed, my machine just kept crashing… less than an hour and bang… So, I installed SoftRAID, and the speed is OK… Not massive speeds, but I’m not 100% sure I am using the right cables… More testing with cables soon… But in reality, this is software RAID 5 over 5 spinning disks, where every write also has to update parity. 270MB/s read ain’t bad… 115MB/s write ain’t great, but it’s RAID 5…

Cut to today: the trial of SoftRAID is just about up, and I am not sure I am going to buy it… I have been thinking of installing Proxmox or TrueNAS on GodBoxV3, which already has 8 8TB spinning drives, 7 NVMe drives (2 in RAID 0 for boot, 4 in RAID 5-ish via Windows Storage Spaces, and one not usable for some reason), along with 2 960GB SSDs. If I use the 5-bay enclosure with GodBoxV3, I can use that as one pool (External), the 8 spinning disks inside as a second pool, the NVMes as a third, and the SSDs either as a cache for the 8 internal disks, or possibly a separate pool… But this is something I am still thinking about… Anyway, links to random stuff are below…


Day 44 of #100daysofhomelab

Day 44 of #100daysofhomelab and still quite busy with $DayJob… so some links below.

Also, I have been fiddling with some jq and zerotier-cli commands… Not finished, but I’m working on trying to get some data out of the CLI… I have a GitHub Gist with some details… I plan on adding to it over time.
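For a flavour of the idea (this is a sketch, not the Gist itself): zerotier-cli’s -j flag switches it to JSON output, which jq can then slice up. The field names (address, role, latency) are from the listpeers output:

```bash
# Sketch: list ZeroTier peers with their role and latency, skipping
# peers that have no measured latency yet (reported as -1).
sudo zerotier-cli -j listpeers \
  | jq -r '.[] | select(.latency >= 0) | "\(.address)\t\(.role)\t\(.latency) ms"'
```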

Day 35 of #100daysofhomelab

Day 35 of #100daysofhomelab and I have been trying to clean up some stuff for my MacBook Pro. I have an external enclosure from Yottamaster that has 5 3.5” bays and connects via USB-C (USB 3.1). I have 5 8TB Seagate IronWolf drives in there. Currently, it is set up as RAID 10 with 16TB usable, named Archive, plus one extra unprotected 8TB drive. The details on setting up RAID 10 on macOS are in the links section, but the rough shape of it is sketched below. I was looking at using RAID 5 for the archive pool, but the only option that seems to be available is SoftRAID, and it’s USD 250 for a license unless you have an OWC enclosure… Given the enclosure cost me that much in the first place, I think I will stick with RAID 10 for a while… RAID 5 would potentially give me 32TB usable on my Archive (4 drives of data plus 1 of parity), but 250 is a bit steep… for now…
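For reference, nested RAID 10 with the built-in AppleRAID looks roughly like this. The disk identifiers are placeholders, so check diskutil list for your own before trying anything like it:

```bash
# Rough sketch: RAID 10 as a stripe of mirrors using diskutil's AppleRAID.
# disk4–disk7 are placeholder identifiers for the four member drives.
diskutil appleRAID create mirror MirrorA JHFS+ disk4 disk5
diskutil appleRAID create mirror MirrorB JHFS+ disk6 disk7

# The two mirror sets show up as new virtual disks (assumed here to be
# disk10 and disk11 — `diskutil list` shows the real IDs); stripe them:
diskutil appleRAID create stripe Archive JHFS+ disk10 disk11
```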


I also have a Sabrent USB 3.0 4-bay 2.5” enclosure with 4 500GB Samsung SSDs, named SCRATCH. This is in RAID 0 (I know, I know, if one drive goes MIA, all data is lost… That’s why this is a TEMP folder! It’s backed up to the Archive and also to Backblaze). This is mostly stuff that is downloaded, plus video work that, when completed, is moved to the Archive folder. Anyway, files are currently moving, so I will leave that as is.
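That “move finished work to Archive” step is just a copy, so if I ever script it, it would be something like the line below (the paths are made up for illustration):

```bash
# Sketch: sync finished video projects from the RAID 0 SCRATCH volume
# to the protected Archive volume. Both paths are placeholders.
rsync -avh --progress "/Volumes/SCRATCH/Video/Done/" "/Volumes/Archive/Video/"
```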

As for an update on the RB5009: it was originally planned for today, but the daddy found a TV show on Netflix, so it will have to be done either this evening or tomorrow morning… We will see… Anyway, some links:

Day 4 of #100daysofhomelab

Day 4 of #100daysofhomelab and I am still reading the docs I posted yesterday on Kubernetes. I hope to get something sorted this weekend… On a different note, I posted a new YouTube video on the iODD ST400, linked below. This is a follow-up to my iODD Mini review from a couple of years back. Hopefully, I will have a second video with some speed tests and a better walkthrough in the next few days… hopefully.

Update: I think I am going to have to get my i7 with 6 2.5Gb Ethernet ports and one of the R720s up and running soon… I am running out of memory on my Proxmox cluster.
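As a quick aside, one way to see how tight RAM actually is across the cluster (pvesh ships with Proxmox VE; the mem/maxmem fields are my reading of what the /cluster/resources API returns for node entries, in bytes):

```bash
# Sketch: RAM used vs. total per node across a Proxmox cluster.
pvesh get /cluster/resources --type node --output-format json \
  | jq -r '.[] | "\(.node): \(.mem/1073741824|floor) GiB used of \(.maxmem/1073741824|floor) GiB"'
```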