universal solvent
traches@sh.itjust.works to Programmer Humor@programming.dev • There be Gremlins in the Code • English • 26 · 25 days ago
Yeah, that’s my experience. The backend is an environment you control completely and has well-defined inputs and outputs specifically designed to be handled by machines. Front end code changes on a whim, runs who the hell knows where, and has to look good doing it.
traches@sh.itjust.works to Programmer Humor@programming.dev • Which of these javascript expressions is false? • English • 2 · 1 month ago
It’s pretty easy to avoid all of these, mostly by using ===. Null being an object is annoying and is one of the reasons ‘typeof’ is useless, but there are other ways to accomplish the same thing.
JavaScript has a lot of foot guns, but it’s also used by literally everyone so there is a lot of tooling and practice to help you avoid them.
traches@sh.itjust.works to Programming@programming.dev • GitHub is introducing rate limits for unauthenticated pulls, API calls, and web access • English • 144 · 1 month ago
Probably getting hammered by AI scrapers.
traches@sh.itjust.works to Linux@lemmy.ml • Is it safe to upgrade to paid version of a distro if I'm dual booting? • English • 12 · 1 month ago
That’s a lot of the reason you buy it, but RHEL is a paid product that you buy copies of.
https://www.redhat.com/en/technologies/linux-platforms/enterprise-linux/how-to-buy#online
traches@sh.itjust.works to Linux@lemmy.ml • Is it safe to upgrade to paid version of a distro if I'm dual booting? • English • 19 · 1 month ago
You haven’t heard of Red Hat? Or Ubuntu Pro?
traches@sh.itjust.works to Selfhosted@lemmy.world • 3-2-1 Backups: How do you do the 1 offsite backup? • English • 16 · 1 month ago
NAS at the parents’ house. Restic nightly job, with some plumbing scripts to automate it sensibly.
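For anyone curious what that plumbing might look like, here’s a minimal sketch of a nightly restic run. The repository location, paths, and retention numbers are placeholders I made up for illustration, and the cron/systemd timer wiring is left out.

```bash
#!/usr/bin/env bash
# Minimal nightly restic job sketch - all paths and retention values are illustrative.
set -euo pipefail

export RESTIC_REPOSITORY="sftp:backup@parents-nas:/srv/restic"   # hypothetical NAS repo
export RESTIC_PASSWORD_FILE="$HOME/.config/restic/password"

# Back up the important directories, skipping cache dirs marked with CACHEDIR.TAG.
restic backup "$HOME/photos" "$HOME/documents" --exclude-caches

# Keep a reasonable history and prune everything older.
restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 12 --prune

# Spot-check a slice of the repository data now and then.
restic check --read-data-subset=5%
```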
Shout out to nushell for building an entire shell around this idea!
Have you considered karakeep (formerly hoarder)? It does all of this really well - drop it a URL and it saves a copy. It has lists & tagging (can be done by AI if you want), plus iOS & Android apps as well as browser extensions that make saving stuff super easy.
traches@sh.itjust.works (OP) to Selfhosted@lemmy.world • Incremental backups to optical media: tar, dar, or something else? • English • 1 · 2 months ago
Broadly similar from a quick glance: https://www.amazon.pl/s?k=m-disc+blu+ray
traches@sh.itjust.works (OP) to Selfhosted@lemmy.world • Incremental backups to optical media: tar, dar, or something else? • English • 1 · 2 months ago
My options look like this:
https://allegro.pl/kategoria/nosniki-blu-ray-257291?m-disc=tak
The exchange rate is 3.76 PLN to 1 USD, which is actually the best I’ve seen in years.
traches@sh.itjust.works (OP) to Selfhosted@lemmy.world • Incremental backups to optical media: tar, dar, or something else? • English • 1 · 2 months ago
I only looked at how ZFS tracks checksums because of your suggestion! Hashing 2TB will take a minute; would be nice to avoid.
Nushell is neat, I’m using it as my login shell. Good for this kind of data-wrangling but also a pre-1.0 moving target.
traches@sh.itjust.works to Selfhosted@lemmy.world • Self-Hosted podcast has announced that episode 150 is their last. • English • 17 · 2 months ago
Tailscale deserves it, Bitcoin absolutely does not.
traches@sh.itjust.works (OP) to Selfhosted@lemmy.world • Incremental backups to optical media: tar, dar, or something else? • English • 2 · 2 months ago
Where I live (not the US) I’m seeing closer to $240 per TB for M-disc. My whole archive is just a bit over 2TB, though I’m also including exported jpgs in case I can’t get a working copy of darktable that can render my edits. It’s set to save xmp sidecars on edit so I don’t bother with backing up the database.
I mostly wanted a tool to divide up the images into disk-sized chunks, and to automatically track changes to existing files, such as sidecar edits or new photos. I’m now seeing I can do both of those and still get files directly on the disk, so that’s what I’ll be doing.
I’d be careful with using SSDs for long-term, offline storage. I hear they lose data if not powered for a long time. IMO metadata is small enough to just save a new copy when it changes.
traches@sh.itjust.works (OP) to Selfhosted@lemmy.world • Incremental backups to optical media: tar, dar, or something else? • English • 1 · 2 months ago
I’ve been thinking through how I’d write this. With so many files it’s probably worth using SQLite, and then I can match them up by joining on the hash. Deletions and new files can be found with different join conditions. I found a tool called ‘hashdeep’ that can checksum everything, though for incremental runs I’ll probably skip hashing if the size, times, and filename haven’t changed. I’m thinking nushell for the plumbing? It runs everywhere, though they have breaking changes frequently. Maybe Rust?
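As a rough illustration of the join idea (table and column names here are invented, and the hashdeep import step is omitted), the two change-detection queries could look something like this:

```bash
# Sketch only: assumes a catalog.db with 'previous' and 'current' tables,
# each holding (path, hash) rows from the last and the current scan.
sqlite3 catalog.db <<'SQL'
-- New or modified files: a hash seen now that wasn't there last time.
SELECT c.path
FROM current c
LEFT JOIN previous p ON p.hash = c.hash
WHERE p.hash IS NULL;

-- Deleted files: a hash from last time with no match in the current scan.
SELECT p.path
FROM previous p
LEFT JOIN current c ON c.hash = p.hash
WHERE c.hash IS NULL;
SQL
```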
ZFS checksums are done at the block level, and after compression and encryption. I don’t think they’re meant for this purpose.
traches@sh.itjust.works to Selfhosted@lemmy.world • Self-Hosted podcast has announced that episode 150 is their last. • English • 48 · 2 months ago
Aww, man, I’m conflicted here. On one hand, I’ve enjoyed their work for years and they seem like good dudes who deserve to eat. On the other, they’re AI-enthusiast crypto-bros, and that’s just fucking exhausting. I deal with enough of that bullshit at work.
Edit: rephrase for clarity
traches@sh.itjust.works to Selfhosted@lemmy.world • Is selfhosting your Girlfriend a good idea? 😂 • English • 3 · 2 months ago
humans are neat
traches@sh.itjust.works (OP) to Selfhosted@lemmy.world • Incremental backups to optical media: tar, dar, or something else? • English • 1 · 2 months ago
Yeah, you’re probably right. I already bought all the stuff, though. This project is halfway vibes-based; something about spinning rust just feels fragile, you know?
I’m definitely moving away from the complex archive split & merge solution.
`fpart` can make lists of files that add up to a given size, and `fd` can find files modified since a given date. Little bit of plumbing and I’ve got incremental backups that show up as plain files & folders on a disk.
traches@sh.itjust.works (OP) to Selfhosted@lemmy.world • Incremental backups to optical media: tar, dar, or something else? • English • 3 · 2 months ago
Ohhh boy, after so many people suggested I just put simple files directly on the disks, I went back and rethought some things. I think I’m landing on a solution that does everything and doesn’t require me to manually manage all these files:
- `fd` (and any number of other programs) can produce lists of files that have been modified since a given date.
- `fpart` can produce lists of files that add up to a given size.
- `xorrisofs` can accept lists of files to add to an ISO.
So if I `fd` a list of new files (or don’t, for the first backup), pipe them into `fpart` to chunk them up, and then pass these lists into `xorrisofs` to create ISOs, I’ve solved almost every problem (rough sketch of the pipeline after the list below).
- The disks have plain files and folders on them; no special software is needed to read them. My wife could connect a drive, pop the disk in, and the photos would be right there, organized by folder.
- Incremental updates can be accomplished by keeping track of whenever the last backup was.
- The fpart lists are also a greppable index; I can use them to find particular files easily.
- Corruption only affects that particular file, not the whole archive.
- A full restore can be accomplished with rsync or other basic tools.
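Roughly what I have in mind (flags are from memory and the paths, disc size, and date are placeholders, so double-check fd’s --changed-after, fpart’s -s/-i/-o, and xorrisofs’ -path-list against the man pages):

```bash
#!/usr/bin/env bash
# Sketch of the fd -> fpart -> xorrisofs pipeline; all values are illustrative.
set -euo pipefail

PHOTO_DIR="$HOME/photos"                  # hypothetical source directory
LAST_BACKUP="2024-01-01"                  # date of the previous backup run
DISC_SIZE=$((25 * 1000 * 1000 * 1000))    # ~25 GB usable per BD-R

# 1. List files changed since the last backup (drop the filter on the first run).
fd --type f --changed-after "$LAST_BACKUP" . "$PHOTO_DIR" > changed.txt

# 2. Split the list into disc-sized chunks: fpart writes chunk.1, chunk.2, ...
fpart -s "$DISC_SIZE" -i changed.txt -o chunk

# 3. Build one ISO per chunk; the chunk lists double as a greppable index.
for list in chunk.*; do
  n="${list##*.}"
  xorrisofs -r -V "PHOTOS_$n" -o "backup-$n.iso" -path-list "$list"
done
```

How the files end up laid out inside each ISO depends on the pathspecs; xorrisofs’ graft-point syntax can be used to keep the on-disc folder structure tidy.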
Downsides:
- Change detection is naive. Just mtime. Good enough?
- Renames will still produce new copies. Solution: don’t rename files. They’re organized well enough, stop messing with it.
- Deletions will be disregarded. I could solve this with some sort of indexing scheme, but I don’t think I care enough to bother.
- There isn’t much rhyme or reason to how fpart splits up files. The first backup will be a bit chaotic. I don’t think I really care.
- If I `rsync -a` some files into the dataset which have mtimes older than the last backup, they won’t get slurped up in the next one. Can be solved by checking that all files are already in the existing fpart indices, or by just not doing that.
Honestly those downsides look quite tolerable given the benefits. Is there some software that will produce and track a checksum database?
Off to do some testing to make sure these things work like I think they do!
Yeah, when someone is interested in switching I always advise them to sort out their apps first. Many Linux applications also run on Windows; the reverse is rarely true.