It's very tempting to get an SSD larger than the existing 250GB each time I run out of disk space (thanks, gradle).
Instead, for about the same price, I buy a 2TB spinning external disk, create a disk image of the entire 250GB SSD, and plonk it on the external drive. Then I erase the SSD, do a fresh OS install, and re-evaluate almost every piece of software I need.
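The imaging step is nothing exotic. Here's a minimal sketch in Python, assuming a Linux box where the SSD shows up as /dev/sda and the external drive is mounted at /mnt/external (both names are placeholders); plain dd or a GUI disk utility achieves the same thing:

```python
import gzip
import shutil

# Placeholder device and destination paths; adjust for your own machine.
# Needs root to read the raw block device.
SRC_DEVICE = "/dev/sda"                         # the 250GB SSD
DEST_IMAGE = "/mnt/external/ssd-backup.img.gz"  # on the 2TB spinning disk

def image_disk(src: str, dest: str, chunk: int = 4 * 1024 * 1024) -> None:
    """Stream the raw block device into a gzip-compressed image file."""
    with open(src, "rb") as disk, gzip.open(dest, "wb") as image:
        shutil.copyfileobj(disk, image, length=chunk)

if __name__ == "__main__":
    image_disk(SRC_DEVICE, DEST_IMAGE)
```

Restoring is the same stream in reverse: decompress the image back onto the device if I ever need the old setup again.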
The 250GB SSD is a feature, not a bug. A larger SSD would take longer to snapshot and need more space to store the image. Everything I'm actively working on should fit in 250GB. Over time, old projects just sit around with their gigabytes upon gigabytes of dependencies, node_modules, docker images, and other junk. Say 100GB goes to the home folder, including music, and 50GB to the rootfs. At any given point, do I really have projects needing my active attention that fill the remaining ~100GB? I have a NAS for older projects, and rsync jobs driven by systemd keep projects synced to it. All source code lives in git repositories on GitLab anyway. Unless it's an ML project with large datasets, do I really need an SSD bigger than 250GB?
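The NAS sync is equally boring. A sketch of the kind of rsync call I mean, wrapped in Python purely for illustration; the project directory and NAS path are made-up placeholders, and in reality a systemd timer fires it on a schedule:

```python
import subprocess

# Placeholder paths: adjust to your own project directory and NAS share.
PROJECTS_DIR = "/home/me/projects/"
NAS_TARGET = "nas:/volume1/archive/projects/"

def sync_projects(src: str = PROJECTS_DIR, dest: str = NAS_TARGET) -> None:
    """Mirror the projects directory to the NAS over SSH with rsync."""
    subprocess.run(
        ["rsync", "-a", "--delete", src, dest],
        check=True,  # raise if rsync reports an error
    )

if __name__ == "__main__":
    sync_projects()
```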
Somewhere in there is a metaphor supporting minimalism.