
The Log Detective backups eat 100G #187

Open
praiskup opened this issue Oct 30, 2024 · 1 comment
Comments

@praiskup
Member

praiskup commented Oct 30, 2024

The proposal from @jpodivin was to avoid the redundancy: if no new updates happen in the database, do not store additional tarballs, or run /bin/hardlink over reproducible tarballs.

We could also run `find -mtime +100 -delete` or similar.
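The age-based cleanup could look roughly like this. This is a hypothetical sketch: the backup directory path and the tarball naming pattern are assumptions, not taken from the issue.

```shell
#!/bin/sh
# Sketch: prune backup tarballs older than 100 days.
# BACKUP_DIR is an assumed location for illustration only.
BACKUP_DIR="${BACKUP_DIR:-/var/backups/log-detective}"

# -mtime +100 matches files last modified more than 100 days ago;
# restricting to '*.tar*' keeps unrelated files safe from -delete.
find "$BACKUP_DIR" -maxdepth 1 -type f -name '*.tar*' -mtime +100 -delete
```

Running this from cron would cap retention at roughly 100 days regardless of how much each daily tarball weighs.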

@github-project-automation github-project-automation bot moved this to Needs triage in CPT Kanban Oct 30, 2024
@nikromen nikromen moved this from Needs triage to In 2 years in CPT Kanban Oct 30, 2024
@jpodivin
Collaborator

I have a set of flags we can use to get reproducible tars. We can apply them, followed by a hardlink invocation, to cut down the storage use.

```
--sort=name --format=posix
--pax-option=exthdr.name=%d/PaxHeaders/%f
--pax-option=delete=atime,delete=ctime
--numeric-owner --owner=0 --group=0
--mode=go+u,go-w --mtime=@0
```
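To show how the flags above combine into a full invocation, here is a sketch that archives the same input twice and checks that the two tarballs come out byte-identical, which is what lets hardlink deduplicate them. The source and output paths are illustrative, not from the issue; it assumes GNU tar (which provides `--sort` and `--pax-option`).

```shell
#!/bin/sh
# Two runs over unchanged input should yield byte-identical archives,
# so /usr/bin/hardlink (util-linux) can merge the daily copies.
src=$(mktemp -d); out=$(mktemp -d)
echo "dump" > "$src/db.sql"

for day in monday tuesday; do
    # The flags strip everything nondeterministic: entry order, pax header
    # names (default includes the PID), atime/ctime, ownership, and mtime.
    tar --sort=name --format=posix \
        --pax-option=exthdr.name=%d/PaxHeaders/%f \
        --pax-option=delete=atime,delete=ctime \
        --numeric-owner --owner=0 --group=0 \
        --mode=go+u,go-w --mtime=@0 \
        -cf "$out/backup-$day.tar" -C "$src" .
done

cmp "$out/backup-monday.tar" "$out/backup-tuesday.tar" && echo "identical"
```

Note that any compression layer must also be deterministic (e.g. `gzip -n`, which omits the embedded timestamp), otherwise the archives diverge again.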

Labels: None yet
Projects: Status: In 2 years
Development: No branches or pull requests
2 participants