shred: Option to use random but entire files as source
From: Nikita Zlobin
Subject: shred: Option to use random but entire files as source
Date: Sun, 5 Jan 2020 12:11:44 +0500
I know that shred allows choosing any file to take random bytes from.
However, there is still one issue, which I have seen raised both in
articles and in informal discussion of secure erasing: the very fact
that such data removal was performed can itself be sensitive, and with
the current approaches (overwriting with random data, patterns, or
zeros) it is non-trivial to hide.
Thus, another approach came to my mind: use entire files, or relatively
large chunks of them, as the overwrite source. The following options
could be provided (a rough sketch follows the list):
- a set of paths/files allowed as sources, similar to most file search
  indexers such as locate, Baloo, or Tracker;
- a chunk size range, min and max; if max is unset (e.g. -1), that
  would mean no upper limit on chunk size;
- in addition to the first option, perhaps some kind of "weight" for
  different paths (even nested inside others), affecting their final
  priority.
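To make the idea concrete, here is a minimal sketch of the proposed
behaviour, outside of shred itself. The function names, the weight
table, and the default chunk sizes are all hypothetical, not existing
shred options; it only illustrates how the three options above could
interact.

#!/usr/bin/env python3
# Rough sketch of the proposed overwrite mode: fill the target with
# chunks copied from existing files instead of random bytes, patterns,
# or zeros.  All names and defaults here are hypothetical.
import os
import random


def gather_sources(paths, weights=None):
    """Collect (file, weight) candidates under the allowed source paths.
    `weights` maps a source root to a relative priority (default 1)."""
    weights = weights or {}
    candidates = []
    for root in paths:
        w = weights.get(root, 1)
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                if os.path.isfile(path) and os.path.getsize(path) > 0:
                    candidates.append((path, w))
    return candidates


def overwrite_with_files(target, sources, min_chunk=4096, max_chunk=None):
    """Overwrite `target` in place with chunks taken from `sources`.
    max_chunk=None corresponds to the proposed 'unlimited maximum'."""
    size = os.path.getsize(target)
    files = [p for p, _ in sources]
    weights = [w for _, w in sources]
    with open(target, "r+b") as out:
        written = 0
        while written < size:
            # Pick a source file, biased by its path weight.
            src = random.choices(files, weights=weights, k=1)[0]
            src_size = os.path.getsize(src)
            limit = src_size if max_chunk is None else min(max_chunk, src_size)
            want = random.randint(min(min_chunk, limit), limit)
            want = min(want, size - written)
            # Copy a chunk starting at a random offset in the source.
            offset = random.randint(0, src_size - 1)
            with open(src, "rb") as f:
                f.seek(offset)
                data = f.read(want)
            if not data:
                continue
            out.write(data)
            written += len(data)


if __name__ == "__main__":
    # Hypothetical usage: overwrite a target using documentation files
    # as the source, with one path weighted higher.
    srcs = gather_sources(["/usr/share/doc"], weights={"/usr/share/doc": 2})
    if srcs:
        overwrite_with_files("/tmp/victim.bin", srcs, min_chunk=4096)

A real implementation inside shred would presumably also need the
syncing between passes and the other low-level handling that shred
already does for its existing random/pattern/zero passes.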
Sure, it looks pretty complex, but IMHO, what could look less
suspicious than the unwanted area getting "carelessly" overwritten
during usual file copy/backup operations, due to some fs "bug" :).