--suppress-size still recursively computes file sizes even when sorting is disabled #69
Yeah, I guess this is the opposite problem to what was reported previously. So, for context on the traversal and disk-usage algorithm: the reason we need to recur in both steps is that all knowledge of filesystem hierarchies is gone once we traverse in parallel. Parallel traversal theoretically increases throughput when dealing with filesystem I/O, but it's then up to you to extrapolate your file structure again using the paths of each entry. This is of course conjecture and I haven't had much time to consider it super thoughtfully, but it's on my radar.

Thanks a bunch for taking an interest in the project and making an issue out of this! I plan to do weekly minor releases for the foreseeable future, and depending on how busy I am with work, the fix for this will be out either this Sunday or the next. Let me know if you have any thoughts :)
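The "extrapolate your file structure again using the paths" step can be sketched roughly like this, using only the standard library. This is a simplified illustration, not erdtree's actual code; `aggregate_sizes` and the sample paths are hypothetical:

```rust
use std::collections::HashMap;
use std::path::{Path, PathBuf};

/// Roll flat (path, size) results from a parallel walk back up into
/// per-directory totals by crediting each file's size to its ancestors.
fn aggregate_sizes(entries: &[(PathBuf, u64)]) -> HashMap<PathBuf, u64> {
    let mut totals: HashMap<PathBuf, u64> = HashMap::new();
    for (path, size) in entries {
        // skip(1) skips the file itself; the rest are its ancestor dirs.
        for ancestor in path.ancestors().skip(1) {
            *totals.entry(ancestor.to_path_buf()).or_insert(0) += size;
        }
    }
    totals
}

fn main() {
    // Pretend these arrived, in arbitrary order, from a parallel walker.
    let entries = vec![
        (PathBuf::from("root/a/x.txt"), 10),
        (PathBuf::from("root/a/y.txt"), 5),
        (PathBuf::from("root/b/z.txt"), 7),
    ];
    let totals = aggregate_sizes(&entries);
    assert_eq!(totals[Path::new("root/a")], 15);
    assert_eq!(totals[Path::new("root")], 22);
    println!("root total = {}", totals[Path::new("root")]);
}
```

Because the walker emits entries in no particular order, the hierarchy really does have to be rebuilt from the paths in a second pass like this.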
I'm a simple man. I like pretty colors and file icons. So I installed this to replace tree and exa, and set up an alias to run
et -i -I -s size -l 2
All nice and pretty. Then I tried running the command at my root directory and realized that, wait, this is a tool that was made to calculate disk usage, not just print pretty trees! It took longer than necessary just to print 2 layers of folders.
So I tried --suppress-size. Still, it took the same amount of time.
So then I thought it was using the size data to sort the files. I checked the source code and, yeah, it holds the data until it tries to sort. But it would be nice™ to skip that work: wire WalkBuilder::max_depth() into TryFrom<&Context> for WalkParallel, then account for max_depth < level by setting the file sizes of deeper folders to 0, and hopefully done?
Now this should let you look at the shallow sizes of directories, if ever that's a thing that you need to do.
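The proposed gate could look something like the sketch below. This is a guess at the shape of the fix, not erdtree's code; `should_compute_size`, `depth_relative_to`, and the flag names are hypothetical:

```rust
use std::path::Path;

/// Depth of `path` relative to `root`, counted in path components.
fn depth_relative_to(root: &Path, path: &Path) -> usize {
    path.strip_prefix(root)
        .map(|rel| rel.components().count())
        .unwrap_or(0)
}

/// Only compute a recursive size when sizes are actually shown and the
/// entry is within the display depth; deeper folders would report 0.
fn should_compute_size(suppress_size: bool, max_depth: Option<usize>, depth: usize) -> bool {
    if suppress_size {
        return false;
    }
    match max_depth {
        Some(max) => depth <= max,
        None => true,
    }
}

fn main() {
    let root = Path::new("/home/user");
    let entry = Path::new("/home/user/a/b/c");
    let depth = depth_relative_to(root, entry);
    println!("depth = {}", depth);

    // --suppress-size: never compute sizes.
    assert!(!should_compute_size(true, None, depth));
    // -l 2: compute only for entries at or above the display depth.
    assert!(should_compute_size(false, Some(2), 1));
    assert!(!should_compute_size(false, Some(2), depth));
}
```

Note the trade-off the comment above already acknowledges: with a depth cap, the sizes shown for shallow directories only cover what was visited, so they are "shallow sizes" rather than true recursive totals.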