
Post History

Q&A How to find big files and directories?


posted 2mo ago by AdminBee‭  ·  edited 2mo ago by AdminBee‭

Answer
#3: Post edited by AdminBee‭ · 2024-10-02T08:10:30Z (about 2 months ago)
Mention graphical tool
Before:

If you want to go with command-line tools, the first step might be to run a

```
du -h --max-depth=1 | sort -h -k 1 -r
```

in your root directory. This will give you a list of the sub-directories sorted by gross usage. You may then proceed by applying the same command inside the sub-directory identified as the "heaviest", to ultimately find the files that eat up all your storage space.

Note that for a "complete picture" you may need to do this with `sudo` privileges to access directories that your regular user may not enter. Also keep in mind that there are limitations to the accuracy of disk usage reported by `du`, as explained e.g. [in the Wiki page](https://en.wikipedia.org/wiki/Du_(Unix)).

After:

If you want to go with command-line tools, the first step might be to run a

```
du -h --max-depth=1 | sort -h -k 1 -r
```

in your root directory. This will give you a list of the sub-directories sorted by gross usage. You may then proceed by applying the same command inside the sub-directory identified as the "heaviest", to ultimately find the files that eat up all your storage space.

Note that for a "complete picture" you may need to do this with `sudo` privileges to access directories that your regular user may not enter. Also keep in mind that there are limitations to the accuracy of disk usage reported by `du`, as explained e.g. [in the Wiki page](https://en.wikipedia.org/wiki/Du_(Unix)).

If you want to go for a graphical tool, there is the [`QDirStat`](https://github.com/shundhammer/qdirstat) project which will represent the individual files as tiles (with sizes proportional to their ... size).
#2: Post edited by AdminBee‭ · 2024-10-02T07:58:00Z (about 2 months ago)
Minor rewording
Before:

If you want to go with command-line tools, the first step might be to run a

```
du -h --max-depth=1 | sort -h -k 1 -r
```

in your root directory. This will give you a list of the sub-directories sorted by gross usage. You may then proceed by applying the same command to the sub-directory identified as the "heaviest", to ultimately find the files that eat up all your storage space.

Note that for a "complete picture" you may need to do this with `sudo` privileges to access directories that your regular user may not enter. Also keep in mind that there are limitations to the accuracy of disk usage reported by `du`, as explained e.g. [in the Wiki page](https://en.wikipedia.org/wiki/Du_(Unix)).

After:

If you want to go with command-line tools, the first step might be to run a

```
du -h --max-depth=1 | sort -h -k 1 -r
```

in your root directory. This will give you a list of the sub-directories sorted by gross usage. You may then proceed by applying the same command inside the sub-directory identified as the "heaviest", to ultimately find the files that eat up all your storage space.

Note that for a "complete picture" you may need to do this with `sudo` privileges to access directories that your regular user may not enter. Also keep in mind that there are limitations to the accuracy of disk usage reported by `du`, as explained e.g. [in the Wiki page](https://en.wikipedia.org/wiki/Du_(Unix)).
#1: Initial revision by AdminBee‭ · 2024-10-02T07:56:40Z (about 2 months ago)
If you want to go with command-line tools, the first step might be to run a
```
du -h --max-depth=1 | sort -h -k 1 -r
```
in your root directory. This will give you a list of the sub-directories sorted by gross usage. You may then proceed by applying the same command to the sub-directory identified as the "heaviest", to ultimately find the files that eat up all your storage space.

Note that for a "complete picture" you may need to do this with `sudo` privileges to access directories that your regular user may not enter. Also keep in mind that there are limitations to the accuracy of disk usage reported by `du`, as explained e.g. [in the Wiki page](https://en.wikipedia.org/wiki/Du_(Unix)).
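The drill-down approach described in the answer can be sketched end to end against a throwaway temporary directory instead of the root filesystem (the `big`/`small` directory names and file sizes below are made up for illustration; on a real system you would start at `/` and may need `sudo`):

```shell
# Build a small demo tree so the pipeline can be demonstrated safely.
demo=$(mktemp -d)
mkdir -p "$demo/big" "$demo/small"

# Fill the two sub-directories with dummy data of different sizes.
dd if=/dev/zero of="$demo/big/file" bs=1024 count=512 2>/dev/null
dd if=/dev/zero of="$demo/small/file" bs=1024 count=8 2>/dev/null

# Step 1: list the immediate sub-directories, largest first.
du -h --max-depth=1 "$demo" | sort -h -k 1 -r

# Step 2 would be repeating the same command inside the "heaviest"
# sub-directory reported above, and so on, until the culprit files appear.

rm -rf "$demo"
```

The first line of the sorted output is the directory's own total; the entries after it are the sub-directories in descending order of usage, so the one to descend into is always near the top.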