Just fucking putting data into json files in a folder is so much easier than databases. Sure, databases are faster and give you access to powerful SQL queries. But if you’re just making something small and simple, you don’t need those things. Save yourself the hassle and just use the filesystem.
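For small stuff that really can be the whole storage layer. A minimal sketch of the pattern, assuming one JSON file per record (the folder name and fields are just illustrative):

```python
import json
from pathlib import Path

DATA_DIR = Path("data")  # hypothetical folder holding one JSON file per record
DATA_DIR.mkdir(exist_ok=True)

def save(record_id: str, record: dict) -> None:
    # One record per file; the filename doubles as the primary key.
    (DATA_DIR / f"{record_id}.json").write_text(json.dumps(record, indent=2))

def load(record_id: str) -> dict:
    return json.loads((DATA_DIR / f"{record_id}.json").read_text())

def load_all() -> dict[str, dict]:
    # A "full table scan" is just globbing the folder.
    return {p.stem: json.loads(p.read_text()) for p in DATA_DIR.glob("*.json")}

save("user-42", {"name": "Ada", "plan": "free"})
print(load("user-42"))
```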
Or use MinIO/S3, which can be either the best or the worst of both worlds depending on your use case.
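Same idea, just with the "folder" living in object storage. Roughly what that looks like with boto3 pointed at a local MinIO; the endpoint, bucket name, and credentials below are placeholders:

```python
import json
import boto3

# Placeholder endpoint/credentials for a local MinIO; swap in real S3 config.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",
    aws_access_key_id="minioadmin",
    aws_secret_access_key="minioadmin",
)
BUCKET = "app-data"  # assumed to already exist

def save(key: str, record: dict) -> None:
    s3.put_object(Bucket=BUCKET, Key=f"{key}.json", Body=json.dumps(record))

def load(key: str) -> dict:
    obj = s3.get_object(Bucket=BUCKET, Key=f"{key}.json")
    return json.loads(obj["Body"].read())

save("user-42", {"name": "Ada", "plan": "free"})
print(load("user-42"))
```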
The trouble is, filesystems don't give you ACID. The first time you lose power in the middle of a write, you can end up with a truncated or corrupted file.
Hmm that’s a valid criticism, thanks for pointing it out
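If you do stick with plain files, the usual mitigation is to never write in place: write to a temp file, fsync it, then atomically rename it over the old one. A sketch of that pattern, still not full ACID (no transactions across files, and full durability would also mean fsyncing the directory), but it covers the torn-write case:

```python
import json
import os
import tempfile

def atomic_write_json(path: str, record: dict) -> None:
    # Write to a temp file in the same directory, then atomically rename it
    # over the target so readers only ever see the old or the new version.
    dirname = os.path.dirname(os.path.abspath(path))
    os.makedirs(dirname, exist_ok=True)
    fd, tmp_path = tempfile.mkstemp(dir=dirname, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(record, f)
            f.flush()
            os.fsync(f.fileno())    # make sure the bytes actually hit the disk
        os.replace(tmp_path, path)  # atomic rename on POSIX filesystems
    except BaseException:
        os.unlink(tmp_path)
        raise

atomic_write_json("data/user-42.json", {"name": "Ada", "plan": "free"})
```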
DuckDB can query them with SQL as if they were in a database. CSV, TSV, and Parquet too. You can even connect to and query Postgres and cloud storage.
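For example, with the duckdb Python package (the file paths and column names here are just examples):

```python
import duckdb

# Query a folder of JSON files as if it were a table.
rows = duckdb.sql("SELECT name, plan FROM read_json_auto('data/*.json')").fetchall()
print(rows)

# The same trick works for CSV/Parquet globs.
duckdb.sql("SELECT count(*) FROM 'logs/*.parquet'").show()
```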
It also makes you re-implement a database, but worse.
Use a JSON file if you just need to serialize/deserialize some data. Use SQLite or a DB server if you need more. Your own code will never match the quality of SQLite.
SQLite is the best option in most cases.
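And the "use SQLite if you need more" step is barely more code than the JSON version. Something like this with the stdlib sqlite3 module (the table schema is made up for illustration):

```python
import sqlite3

con = sqlite3.connect("app.db")
con.execute("CREATE TABLE IF NOT EXISTS users (id TEXT PRIMARY KEY, name TEXT, plan TEXT)")

# Writes are transactional, so a power loss mid-write won't corrupt committed data.
with con:
    con.execute(
        "INSERT OR REPLACE INTO users (id, name, plan) VALUES (?, ?, ?)",
        ("user-42", "Ada", "free"),
    )

print(con.execute("SELECT name, plan FROM users WHERE id = ?", ("user-42",)).fetchone())
con.close()
```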
Made a bash script using SQLite recently to automate some data processing. Best part is it terrifies everyone else at work, and no one wants to touch it because it's on Linux and none of them have used it before.