
Thanks for the feedback! Agreed, Git can be used to sync your notes. It's a great solution for those comfortable putting their notes into a Git host like GitHub. I wasn't comfortable with that, however.

Currently vetting a way to sync my database files with the markdown files on my laptop, so it functions similarly to Obsidian. I enjoy Vim too much to work constrained to Directus' markdown editor!




It's not just Git. There are plugins available for S3, CouchDB, FTP, MongoDB, cloud drives, rsync, Syncthing, and probably every other storage backend/protocol in the world. And they're all available for free in Obsidian.


Common ways to sync Obsidian are through cloud tools (Google Drive, OneDrive, etc.), SyncThing Fork or Git.

I'd recommend looking into SyncThing Fork or a similar tool if you never want your notes to leave your own server.

I wrote about ways to sync Obsidian here: https://bryanhogan.com/blog/how-to-sync-obsidian


If you need multiplayer sync, I've been working on a plugin that makes Obsidian real-time multiplayer (like google docs) called Relay [0]. A lot of our users use it in combination with SyncThing to keep it entirely free for up to 3 users (we also offer a paid plan with attachment storage and more seats).

[0] https://relay.md


Git is decentralised. You can sync between laptop and phone directly, no third party server required.


To be clear, GitHub is centralized, but Git is not. You can sync between laptop and phone directly with Git -- no third party server required.
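
A minimal sketch of what that can look like, assuming the laptop runs an SSH server and the phone has git available (for example via Termux on Android); the user, hostname and path below are made up:

    # on the phone, clone the notes repo straight from the laptop over SSH
    git clone ssh://me@laptop.local/home/me/notes
    cd notes

    # pulling the laptop's changes works out of the box
    git pull

    # pushing into a repo with a checked-out branch is refused by default,
    # so either push to a bare repo on the laptop or pull from the phone
    # on the laptop side instead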


How do you use git on a phone? I haven't found an easy way to pull to my phone and open Obsidian files.


No one said anything about GitHub… git is perfectly fine for this use case and 100% private.


What about git makes you uncomfortable?

I saw that you didn’t want to use a 3rd party provider, but why not stick a git repo on your VPS (which you are trusting with your data today) and use that to coordinate syncs between your client devices?


Made a comment in the thread explaining this: https://news.ycombinator.com/item?id=44023090

I expect my PKMS to evolve and wouldn't rule out a self-hosted Git server if I find it's a better option long term.


> wouldn't rule out a self-hosted Git server

I don't think you really get it. Git is distributed. There's no need for "a git server". You already have a machine on which you host the SQL database; you can just use that as yet another git remote.


Thanks for the reply. I do agree with the sibling comment from tasuki that you're missing the simpler solution of plain git repos to solve "owning your own data in a future-proof manner".

If you’re not trying to coordinate work among multiple people, and aren’t trying to enforce a single source of truth with code, you don’t _need_ “git server” software. You just need a git repository (folder & file structure) in a location that you consider to be your source of truth.

I’m not trying to convince you to change it now, especially if you’re happy with what you have, but I would suggest reading some (or all) of https://git-scm.com/book/en/v2

I think the first ~4 subsections of chapter 4 cover what I & tasuki were suggesting could be sufficient for you. If you’re the type of engineer to read through the code of your data storage layer, I think you’d find Chapter 10 (Git Internals) interesting, because it can demystify git. I enjoyed the whole book.

As with any engineering project, I see lots of questions about your choices, and I applaud you for sticking around. I would make very different decisions than you, based on your stated priorities, but that’s okay.


You only really need SSH access on a box to use it as a git remote - no server needed.

I learnt this quite late and it wasn't obvious to me, so I hope it's helpful for you too.
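
In case a concrete sketch helps (hostname, user and paths are placeholders, and the branch name may differ): create a bare repository on the box once, then point each device at it.

    # once, on the box you already have SSH access to
    ssh me@mybox 'git init --bare ~/notes.git'

    # on each device, add it as a remote and sync as usual
    git remote add origin me@mybox:notes.git
    git push -u origin main
    git pull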


The odd part here is why take it to 100%+ when you could just build a plugin for Obsidian rather than rebuilding the whole thing? Seems a bit extreme.


In 20 years will that plugin work? I doubt it.


You can’t even compile stuff from 20 years ago without some extensive archaeological effort. I doubt this will be your largest problem by then.


???

I have dozens of projects from 20 years ago that I can compile today.


With respect, I doubt it. Have you tried pulling out a 20 year old tarball and compiling it, without modification, on a modern distro?


I recently unearthed something that I thought was 20 years old when someone asked me about it. I checked and it was only 14 years old based on mtime (though I suspect I started the project nearly 20 years ago). Another project I unearthed for a different reason was only 13 years old by mtime (again, it was started before that). I must concede that I haven't actually recently compiled and used anything that was untouched for 20 years.

I should note that the first program I wrote that was actually used for a purpose (it calculates energy based on an internal stopwatch and typed-in readings from a voltmeter and ammeter, for a science project in 1992) still works in QB64 today.

The second program I wrote that was actually used for a purpose assumes a parallel-port printer on DOS that uses a fairly old version of PCL, and was written in 16-bit C, so probably won't work today.


A lot of these things can be made to work. That isn't being contested. But if you take a random piece of code off the internet from 20 years ago, it very likely won't compile out of the box on a modern system.

For example, I just took the oldest version of openssl I could find with a quick search (2015, so only 10 years old), and it fails to compile on my Mac. It detects macOS/darwin, and then proceeds to compile for 32-bit Intel, which obviously doesn't work. OpenSSL has fallbacks for straight C implementation to support platforms that haven't been customized, but their build scripts assume that macOS = Intel.

Ok sure, changing the whole freaking CPU architecture will bork a build script. So to prove a point I just downloaded v2.6.11 of the Linux kernel (released in 2005), unpacked (this time on Ubuntu 24.04 on real Intel), and did a `make menuconfig && make`. Of course I don't expect a 20 year old kernel to run on modern hardware, but could I compile it? No, I could not: modern GCC forces PIC by default, which parts of the Linux kernel do not support. I was able to fix that by editing the makefile to pass `-fno-pic` in CFLAGS. Then I get hit with another error due to "multiple definitions" of functions that are declared slightly differently. Turns out old GCC didn't warn about this, but modern GCC handles these declarations differently. This is after pages upon pages of warnings, btw, with only a few source files compiled so far.

I gave up. This is what is meant by archeology required: for anything nontrivial you often have to build the environment in which the code was originally compiled in order to get it to compile again.
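
For anyone who wants to retrace the experiment, roughly the sequence was as follows (exact tarball name and Makefile details may differ; treat this as a sketch):

    # unpack the 2.6.11 tree and try a default build
    tar xf linux-2.6.11.tar.bz2 && cd linux-2.6.11
    make menuconfig
    make    # fails: modern GCC emits position-independent code by default

    # edit the top-level Makefile to append -fno-pic to CFLAGS, then retry
    make    # next stop: "multiple definition" errors that old GCC tolerated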


I have lots of stuff that's 20 years old that still builds ... C, Perl, shell scripts ...


A shell script is something very different from a markdown editing app with plugins, file synchronisation, multi-platform support, and many more moving parts. And even a 20 year old shell script is probably going to fare pretty poorly.

Do you remember what the most common processor was in 2005? A Pentium 4, or a Celeron maybe. That was when 64-bit operating systems were just becoming a thing. I’d really like to see you getting a version of, say, OpenSSL from 2005 to compile on modern hardware…


"A shell script is something very different"

So what? Maybe read the thread that you're responding to.

"And even a 20 year old shell script is probably going to fare pretty poorly."

You can't just say "my sweeping assertion is probably right". I have shell scripts that are more than 20 years old that still run. And I have C89 programs that still compile and run.

"I’d really like to see you getting a version of, say. OpenSSL from 2005 to compile on modern hardware…"

This is a ridiculous disingenuous strawman. I can't get my FORTRAN II programs that I wrote in 1965 to run, but that has nothing to do with the original claim that I responded to, which was a sweeping generalization that a SINGLE counterexample refutes.

Over and out.


This is why I didn't like Obsidian: half the plugins I tried didn't work, despite being among the top 20 most downloaded. Meanwhile I'll use 15-year-old Emacs plugins that haven't been updated in 5 years and they work fine (I think org-diary or something along those lines was what I tried).


In 20 years you might be dead.


Directus is not eternal either. It's OSS, but you can't maintain it yourself forever. For such a long run this looks like a questionable choice to me.


Your AI will straight up write you the plugin, if it hadn't already done that seamlessly when you requested it render your file.


Some people just enjoy the process, and you'll always learn something new.


GitHub is not git



