[Oct. 8th, 2006|10:58 pm]
Did a bunch of work on brackup this weekend:
- make decryption use --use-agent and --batch, and help out if the env
isn't set and gpg-agent probably isn't running
- new --du-stats command to act like the du(1) command, but
based on a root in brackup.conf, and skipping ignored directories.
Good for knowing how big a backup will be.
- walk directories smarter: skip whole directories early when the
ignore patterns show nothing under them can ever match.
- deal w/ encryption better: tell chunks when the backup target
will need their data, so they can forget their cached digest/length
info ahead of time w/o errors/warnings later.
- start of stats code (to give stats after a backup). not done.
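The --du-stats and smarter-walk items above boil down to the same trick: prune ignored subtrees before descending into them, rather than stat'ing everything and filtering afterward. A minimal Python sketch (brackup itself is Perl; the prefix-style ignore patterns and function names here are my own stand-ins, not brackup's actual config or code):

```python
import os

# Stand-ins for ignore patterns from brackup.conf (hypothetical format).
IGNORE_PREFIXES = ["tmp/", "cache/"]

def is_ignored(rel_path):
    """True if this root-relative path falls under an ignored prefix."""
    return any(rel_path.startswith(p) for p in IGNORE_PREFIXES)

def du_stats(root):
    """du(1)-style byte total under root, skipping ignored subtrees
    entirely (never descending into them)."""
    total = 0
    for dirpath, dirnames, filenames in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        rel = "" if rel == "." else rel + "/"
        # Prune early: editing dirnames in place stops os.walk from
        # ever entering directories the ignore patterns rule out.
        dirnames[:] = [d for d in dirnames if not is_ignored(rel + d + "/")]
        for f in filenames:
            if not is_ignored(rel + f):
                total += os.path.getsize(os.path.join(dirpath, f))
    return total
```

The in-place `dirnames[:]` edit is the key move: an ignored directory costs one name comparison instead of a full recursive walk.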
Regarding my previous post about changing the format: don't worry (all two of you), I'm not. I realized all the metadata I need is (mostly) already in the *.brackup backup meta files written at the end of each backup (and stored on the target), so I don't need to introduce more.
Also planned how compression will work for the not-using-gpg case (though I should also explicitly set some gpg settings when having gpg do the compression).
I'm planning on splitting the Brackup::Chunk class into two or three distinct-but-related classes: Brackup::RawChunk, Brackup::StoredChunk, and Brackup::ChunkHandle. I realized it's confusing how it is now, pretending the RawChunk -> StoredChunk mapping is 1:1 when it's not.
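To make the split concrete, here's a hypothetical Python sketch of the three roles (brackup is Perl, and these class bodies and field names are illustrative guesses at the design, not the real API):

```python
class RawChunk:
    """A span of a source file as it sits on disk."""
    def __init__(self, path, offset, length):
        self.path, self.offset, self.length = path, offset, length

class StoredChunk:
    """What actually lands on the backup target: possibly encrypted
    and/or compressed, so its digest and length can differ from the
    raw chunk's."""
    def __init__(self, digest, length):
        self.digest, self.length = digest, length

class ChunkHandle:
    """Ties a raw chunk to its stored form.  Keeping this a separate
    object makes it explicit that raw -> stored is not 1:1: the same
    raw bytes can yield different stored chunks (e.g. gpg output is
    nondeterministic), and dedup can map many raw chunks to one
    stored chunk."""
    def __init__(self, raw, stored):
        self.raw, self.stored = raw, stored
```

The point of the handle object is that code which only needs "where did this file's bytes end up?" never has to pretend the raw and stored representations are interchangeable.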
Then I want to separate out the "digest database" into distinct data structures so they can be maintained and documented separately. For instance, the digest database used to be called the "digest cache" back when all it did was cache the digests of files and chunks of files. I want to get back to that, and move the more important stuff into per-target "what does this target have stored?" databases. Then brackup can grow a --rebuild-target-inventory-dbs mode or something.
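The separation above is really about disposability: one structure is a pure cache you can throw away, the other is a record you'd rather rebuild from the target's *.brackup meta files. A hedged Python sketch of the two roles (names and shapes are mine, not brackup's):

```python
class DigestCache:
    """Pure cache: key -> digest.  Losing it only costs re-hashing."""
    def __init__(self):
        self._cache = {}

    def get_or_compute(self, key, compute):
        """Return the cached digest for key, computing it once if absent."""
        if key not in self._cache:
            self._cache[key] = compute()
        return self._cache[key]

class TargetInventory:
    """Per-target record of which stored chunks the target already
    holds, so unchanged chunks aren't re-uploaded.  Rebuildable by
    reading the backup meta files stored on the target."""
    def __init__(self):
        self._have = set()

    def has_chunk(self, digest):
        return digest in self._have

    def note_stored(self, digest):
        self._have.add(digest)
```

With the roles split like this, a --rebuild mode only has to repopulate the inventory; the digest cache can just start empty.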
But it's coming along well. Thinking this all through has taken more time than hacking it out will, so I feel like it's mostly done now. Fortunately I'm getting antsy to get my ~30GB of stuff uploaded, so that'll keep me from getting sidetracked.